# How to deindex your page through webmaster tools
techiegigs · 5 months ago
Text
How to Identify and Disavow Toxic Backlinks
Backlinks play a crucial role in SEO by helping search engines understand the authority and relevance of your website. However, not all backlinks are beneficial. Toxic backlinks—links from spammy, low-quality, or unrelated websites—can harm your site's search rankings and credibility.
In this blog, we'll explore how to identify toxic backlinks and effectively disavow them to safeguard your website's SEO health.
What Are Toxic Backlinks?
Toxic backlinks are links from websites that violate Google’s Webmaster Guidelines. These links are typically:
From spammy, irrelevant, or low-authority domains.
Created using link schemes or purchased links.
Embedded within content stuffed with keywords or spun text.
Associated with harmful activities like hacking or malware.
If left unchecked, toxic backlinks can lead to penalties or even deindexing by search engines.
Step 1: Identify Toxic Backlinks
The first step is to find and assess the quality of your backlinks. Here's how:
1. Use Backlink Analysis Tools
Tools like Google Search Console, Ahrefs, SEMrush, or Moz Link Explorer can help identify backlinks pointing to your site. Export a list of your backlinks for analysis.
2. Check for Key Indicators of Toxic Links
Examine each backlink for signs of toxicity:
Low Domain Authority: Links from sites with little to no authority.
Irrelevant Content: Links from sites unrelated to your niche or industry.
Anchor Text Spam: Over-optimized or irrelevant anchor text.
Link Farms or PBNs: Links from sites solely created for link-building purposes.
Foreign or Suspicious Domains: Links from unknown or foreign domains not aligned with your audience.
3. Conduct a Manual Review
While tools provide valuable data, a manual review ensures you don’t mistakenly disavow legitimate links. Check the linking site’s content quality, relevance, and reputation.
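To speed up this triage, a short script can pre-filter an exported backlink list before the manual pass. This is only a sketch: the CSV column names (referring_domain, domain_rating, anchor) and the spam-term list are assumptions you would adapt to whatever your backlink tool actually exports.

```python
# Minimal triage sketch: pre-filter a backlink export before the manual review.
# The CSV column names and the spam-term list are assumptions -- adjust them
# to whatever your backlink tool (Ahrefs, SEMrush, Moz, etc.) actually exports.
import csv

SPAM_ANCHOR_TERMS = {"casino", "viagra", "payday loan", "free followers"}

def flag_candidates(path, min_rating=10):
    """Return (referring_domain, reasons) pairs worth a closer manual look."""
    flagged = []
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            rating = float(row.get("domain_rating") or 0)
            anchor = (row.get("anchor") or "").lower()
            reasons = []
            if rating < min_rating:
                reasons.append("low domain authority")
            if any(term in anchor for term in SPAM_ANCHOR_TERMS):
                reasons.append("spammy anchor text")
            if reasons:
                flagged.append((row.get("referring_domain", ""), reasons))
    return flagged

if __name__ == "__main__":
    for domain, reasons in flag_candidates("backlinks_export.csv"):
        print(domain, "->", ", ".join(reasons))
```

Anything the script flags still deserves the manual review described above before it goes anywhere near a disavow file.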
Step 2: Create a List of Toxic Backlinks
After identifying toxic links, compile a list of URLs and domains you want to disavow.
Organize Your List
Create two categories:
URLs of specific pages with toxic links.
Entire domains that consistently link to your site with harmful intent.
Tools like Google Sheets or Excel can help you manage this data.
Step 3: Attempt to Remove the Links
Before disavowing, try to remove the toxic backlinks manually:
Contact Website Owners: Reach out to the owners of the sites linking to you. Politely request the removal of the harmful link.
Use Contact Information: Locate their email or use their contact forms. Be professional and concise.
If the website owner is unresponsive or unwilling, proceed to the disavowal step.
Step 4: Disavow Toxic Backlinks
Google’s Disavow Tool allows you to inform the search engine to ignore specific backlinks. Here’s how:
1. Create a Disavow File
The file must be a plain-text document (.txt) containing:
One URL per line: For specific backlinks.
Use the domain: directive for entire domains.
Example:
http://example.com/spammy-page
domain:toxicdomain.com
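If you keep the flagged URLs and domains in a spreadsheet (as in Step 2), a few lines of Python can assemble the file for you. This is a minimal sketch; the two lists reuse the hypothetical addresses from the example above, and Google's format also allows comment lines starting with #.

```python
# Minimal sketch: write a Google disavow file from the lists kept in Step 2.
# The two lists below reuse the hypothetical addresses from the example above.
toxic_urls = ["http://example.com/spammy-page"]   # individual pages to disavow
toxic_domains = ["toxicdomain.com"]               # whole domains to disavow

with open("disavow.txt", "w", encoding="utf-8") as f:
    f.write("# Disavow file generated from the backlink audit\n")  # '#' lines are comments
    for url in toxic_urls:
        f.write(url + "\n")
    for domain in toxic_domains:
        f.write("domain:" + domain + "\n")
```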
2. Upload the File to Google
Go to the Google Disavow Tool.
Select your property.
Upload the disavow file and confirm the submission.
3. Wait for Google to Process
Google will process your request in its next crawl cycle. It may take a few weeks to see results.
Best Practices for Handling Toxic Backlinks
Monitor Your Backlinks Regularly: Use tools like Ahrefs Alerts or Google Search Console to keep track of new backlinks.
Avoid Black Hat SEO Practices: Refrain from buying links or using link schemes, as they often result in toxic backlinks.
Build High-Quality Backlinks: Focus on earning backlinks from reputable and relevant websites through:
Guest blogging.
High-quality content creation.
Outreach campaigns.
Audit Backlinks Periodically: Conduct regular backlink audits to spot and address harmful links early.
Conclusion
Toxic backlinks can undermine your SEO efforts and damage your site’s reputation. By identifying and disavowing harmful links, you can protect your site from penalties and maintain a strong online presence. Regular backlink monitoring, combined with ethical link-building practices, ensures your SEO strategy remains effective and future-proof.
Take control of your backlinks today and ensure your website stays on the path to SEO success!
jasminadiaa · 5 years ago
Link
We provide the details of how to deindex your page through Google's webmaster tools. There are several ways to remove pages from Google's index – changing the status to 404 and allowing the page to drop naturally, or adding a noindex meta tag to your page – but they aren't always immediate. Follow these easy steps to deindex your page through webmaster tools.
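For reference, the noindex meta tag mentioned above is a single line placed in the page's head; Google drops the page from its index the next time it crawls the page, provided the page is not blocked from crawling in robots.txt.

```html
<!-- Inside the <head> of the page you want removed from Google's index -->
<meta name="robots" content="noindex">
```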
kevinalevine · 6 years ago
Text
Thin Content & SEO | How to Avoid a Google Thin Content Penalty
We live in a world of information overload. If 10 years ago it was hard to find content at all, now there’s way too much of it! Which one is good? Which one is bad? We don’t know.
  While this subject is very complex, it’s clear that Google is attempting to solve these content issues in its search results. One of the biggest issues they’ve encountered in the digital marketing world is what they call thin content.
    But what exactly is thin content? Should you worry about it? Can it affect your website’s SEO in a negative way? Well, thin content can get your site manually penalized but it can also sometimes send your website in Google’s omitted results. If you want to avoid these issues, keep reading!
  What Is Thin Content & How Does It Affect SEO?
Is Thin Content Still a Problem in 2019?
How Does Thin Content Affect SEO?
Where Is Thin Content Found Most Often?
How to Identify Thin Content Pages
How to Fix Thin Content Issues & Avoid a Google Penalty
Make sure your site looks legit
Add more content & avoid similar titles
Don’t copy content
Web design, formatting & ads
Video, images, text, audio, etc.
Deindex/remove useless pages
  1. What Is Thin Content & How Does It Affect SEO?
  Thin content is an OnPage SEO issue that has been defined by Google as content with no added value.
  When you’re publishing content on your website and it doesn’t improve the quality of a search results page at least a little bit, you’re publishing thin content.
  For a very dull example, when you search Google for a question such as “What color is the sky?” and there’s an article out there saying “The sky is blue!”, if you publish an article with the same answer you would be guilty of adding no value.
  So does it mean that this article is thin content because there are other articles about thin content out there?
  Well.. no. Why? Because I’m adding value to it. First, I’m adding my own opinion, which is crucial. Then, I’m trying to structure it as logically as possible, address as many important issues as I can and cover gaps which I have identified from other pieces.
  Sometimes, you might not have something new to say, but you might have a better way of saying it. To go back to our example, you could say something like “The sky doesn’t really have a color but is perceived as blue by the human eye because of the way light scatters through the atmosphere.”
  Of course, you would probably have to add at least another 1500 words to that to make it seem like it’s not thin. It’s true. Longer content tends to rank better in Google, with top positions averaging about 2000 words.
  How your content should be to rank
  Sometimes, you might add value through design or maybe even through a faster website. There are multiple ways through which you can add value. We’ll talk about them soon.
  From the Google Webmaster Guidelines page we can extract 4 types of practices which are strictly related to content quality. However, they are not easy to define!
  Automatically generated content: Simple. It’s content created by robots to replace regular content, written by humans. Don’t do it. But… some AI content marketing tools have become so advanced that it’s hard to distinguish between real and automatically generated content. Humans can write poorly too. Don’t expect a cheap freelancer who writes 1000 words for $1 to have good grammar and copy. A robot might be better. But theoretically, that’s against the rules.
Thin affiliate pages: If you’re publishing affiliate pages which don’t include reviews or opinions, you’re not providing any new value to the users compared to what the actual store is already providing on their sales page.
Scraped or copied content: The catch here is to have original content. If you don't have original content, you shouldn't be posting it to claim it's yours. However, even when you don't claim it's yours, you can't expect Google to rank it better than the original source. Maybe there can be a reason (better design, faster website) but, generally, nobody would say it's fair. Scraping is a no-no and Google really hates it.
Doorway pages: Doorway pages are pages created to target and rank for a variety of very similar queries. While this is bad in Google’s eyes, the search giant doesn’t provide an alternative to doorway pages. If you have to target 5-10 similar queries (let’s say if you’re doing local SEO for a client), you might pull something off with one page, but if you have to target thousands of similar queries, you won’t be able to do it. A national car rental service, for example, will always have pages which could be considered doorways.
  If you want, you can listen to Matt Cutts’ explanation from this video.
(embedded YouTube video)
    As you can see, it all revolves around value. The content that you publish must have some value to the user. If it’s just there because you want traffic, then you’re doing it wrong.
  But value can sometimes be hard to define. For some, their content might seem as the most valuable, while for others it might seem useless. For example, one might write “Plumbing services New York, $35 / hour, Phone number”. The other might write “The entire history of plumbing, How to do it yourself, Plumbing services New York, $35 / hour, Phone number.”
  Which one is more relevant? Which one provides more value? It really depends on the user’s intent. If the user just wants a plumber, they don’t want to hear about all the history. They just want a phone number and a quick, good service.
  However, what’s important to understand is that there is always a way to add value.
  In the end, it’s the search engine that decides, but there are some guidelines you can follow to make sure Google sees your content as valuable. Keep reading and you’ll find out all about them. But first, let’s better understand why thin content is still an issue and how it actually affects search engine optimization.
  1.1 Is Thin Content Still a Problem in 2019?
  The thin content purge started on February 23, 2011 with the first Panda Update. At first, Google introduced the thin content penalty because many people were generating content automatically or were creating thousands of irrelevant pages.
  The series of further updates were successful and many websites with low quality content got penalized or deranked. This pushed site owners to write better content.
  Unfortunately, today this mostly translates to longer content. The more you write, the more value you can provide, right? We know it’s not necessarily the case, but as I’ve said, longer content does tend to rank better in Google. Be it because the content makes its way up there or because the search engine is biased towards it… it’s hard to tell.
  But there’s also evidence that long form content gets more shares on social media. This can result in more backlinks, which translates to better rankings. So it’s not directly the fact that the content is long, but rather an indirect factor related to it.
  It’s kind of ironic, as Google sometimes uses its answer boxes to give a very ‘thin’ answer to questions that might require more context to be well understood.
  However, in 2019 it’s common SEO knowledge that content must be of high quality. The issue today shifts to the overload of content that is constantly being published. Everything is, at least to some extent, qualitative.
  But it’s hard to get all the information from everywhere and you don’t always know which source to rely on or trust. That’s why content curation has been doing so well lately.
  This manifests itself in other areas, especially where there’s a very tough competition, such as eCommerce.
  1.2 How Does Thin Content Affect SEO?
  Google wants to serve its users the best possible content it can. If Google doesn’t do that, then its users won’t return to Google and could classify it as a poor quality service. And that makes the search engine unhappy.
Google generally applies a manual action penalty to websites it considers to contain thin content. You will see the notice in Google Search Console (formerly Google Webmaster Tools).
  However, your site can still be affected by thin content even if you don’t get a warning from Google in your Search Console account. That’s because you’re diluting your site’s value and burning through your crawl budget.
  The problem that search engines have is that they constantly have to crawl a lot of pages. The more pages you give it to crawl, the more work it has to do.
  If the pages the search engine crawls are not useful for the users, then Google will have a problem with wasting its time on your content.
  1.3 Where Is Thin Content Found Most Often?
  Thin content is found most of the time on bigger websites. For the sake of helping people that really need help, let’s exclude spammy affiliate websites and automated blogs from this list.
  Big websites, like eCommerce stores, often have a hard time coming up with original, high quality content for all their pages, especially for thousands of product pages.
    In the example above, you can see that although the Product Details section under the image is expanded, there’s no content there. This means that users don’t have any details at all about the dress. All they know is that it’s a dress, it’s black and it costs about $20.
  This doesn’t look too bad when you’re looking as a human at a single page, but when you’re a search engine and take a look at thousands and thousands of pages just like this one, then you begin to see the issue.
  The solution here is to add some copy. Think of what users want to know about your product. Make sure you add the details about everything they might want to know and make them easily accessible!
Sometimes, thin content makes its way into eCommerce sites unnoticed. For example, you might have a category page which hosts a single product. Compared to all your other categories or competitor websites, that can be seen as thin content.
  2. How to Identify Thin Content Pages
  If we are referring merely to its size, then thin content can be easily identified using the cognitiveSEO Tool’s Site Audit.
  Did you know?
Identifying thin content is actually really easy with a tool like cognitiveSEO Site Audit. The tool has a Thin Content section where you can easily find the pages with issues.
It’s as simple as that! Once you have your list, you can export it and start adding some content to those pages. This will improve their chances to make it to the top of the search results.
  However, you also want to take a look at the duplicate content section in the Site Audit tool. This can also lead to a lot of indexation & content issues.
    Extremely similar pages can be “combined” using canonical tags. Sometimes it can be a good idea to remove them completely from the search engine results.
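For reference, a canonical tag is a single line in the head of the near-duplicate page, pointing at the version you want indexed; the URL below is a placeholder.

```html
<!-- On the near-duplicate page, point search engines at the preferred version -->
<link rel="canonical" href="https://example.com/black-dress">
```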
  3. How to Fix Thin Content Issues & Avoid a Google Penalty
  Sometimes, you can fix thin content issues easily, especially if you get a manual penalty warning. At least if your website isn’t huge. If you have thousands of pages, it might take a while till you can fix them.
  Here’ s a happy ending case from one of Doug Cunnington’s students:
(embedded YouTube video)
    However, the “penalty” can also come from the algorithm and you won’t even know it’s there because there is no warning. It’s not actually a penalty, it’s just the fact that Google won’t rank your pages because of their poor quality.
  When that’s the case, it might not be as easy to get things fixed as in the video above.
In order to avoid these penalties, here are a few things you should consider when you write content.
  3.1 Make sure your site looks legit
First of all, if your website looks shady, then you have a higher chance of getting a manual penalty. If someone from Google reviews your website and decides it looks spammy at first glance, they will be more likely to consider penalizing it.
  To avoid this, make sure you:
  Use an original template and customize it a little bit
Have a logo or some sort of original branding
Provide an about page and contact details
  3.2 Add more content & avoid very similar titles
  The best way to show Google that your pages are worth taking a look at is to not leave them empty. In 2019, I hope we all know that for good OnPage SEO we need to add a little bit more content.
  Your pages should have at least 300 words of copy. Notice how I say copy, not words. If you’re there to sell, write copy. Even on an eCommerce product page.
  If you’re not sure what to write about, you can always use the CognitiveSEO Keyword Tool & Content Assistant. It will give you ideas on what you should write on your pages to make them relevant for the query you want them to rank on.
  Automatically generated titles can also quickly trigger Google’s alarms. If you review multiple products from the same brand and your titles are like this:
  Nike Air Max 520 Review
Nike Air Max 620 Review
Nike Air Max 720 Review
  then you can see how it might be an issue. Do those articles provide any value or are they all the same except for one digit?
  It’s important to have the keywords in your title, but you can also try to add some diversity to them. It’s not always very hard to do. A good example could be:
  Nike Air Max 520 Review | Best bang for the buck
Nike Air Max 620 | A Comprehensive Review Regarding Comfort
Nike Air Max 720 | Review After 2 Weeks of Wearing Them at The Gym
  But Adrian, I have an eCommerce site with over 2000 products, I can’t write original titles for all of them!
  That’s why I said that content isn’t the only way you can provide value with. If you can’t change the titles and content, improve some other areas.
  However, the truth is that there’s someone out there who does optimize and show love to all their titles, even if there are 2000 of them. So why shouldn’t they be rewarded for it?
  Usually, very similar titles are a result of content duplication issues. If you have a product that comes in 100 different colors, you don’t necessarily need to have 100 different pages with 100 unique titles and copy. You can just make them 1 single page where users can select their color without having to go to another URL.
  Combining pages can also be done via canonical tags, although it’s recommended to only keep this for duplicate content. Pages with different colors can count as duplicate content, as only one word is different, so the similarity is 99.9%.
  Make sure that the pages that get canonicalized don’t provide organic search traffic. For example, if people search for “blue dress for ladies” then it’s a good idea to have a separate page that can directly rank for that query instead of canonicalizing it to the black version.
  A proper faceted navigation can help you solve all these SEO issues.
  3.3 Don’t copy content
  Copying content from other websites will definitely make your site look bad in Google’s eyes.
  Again, this happens mostly on eCommerce websites, where editors get the descriptions directly from the producer’s official website. Many times they also duplicate pages in order to save time and just change a couple of words.
In the long run, this will definitely get you into duplicate content issues, which can become very hard to fix once they're out of control. It will also tell Google that your site endorses competitors. By using their copy, you're considering it valuable, right?
  3.4 Web design, formatting & ads
Sometimes, you can identify gaps in web design or formatting. That's not easy to do, as you'll have to manually take a look at your competitors' websites. Here are some questions you should ask yourself:
  Are competitors presenting their information in an unpleasant manner? Do they have too many pop-ups, too many ads or very nasty designs?
  Then that’s obviously where you can make a difference. This doesn’t give you the right not to have an original copy, but it might have a greater impact.
  3.5 Video, images, text, audio, etc.
  Big, successful eCommerce businesses which have an entire community supporting them and backing them up have used this technique for a long time: video content.
  This might work better in some niches, such as tech. In Romania, cel.ro has a very bad reputation with delivery and quality, yet it still has a decent amount of market share due to its strong video content marketing strategy.
  If you want to improve the value of your page, make sure you add images, videos or whatever you think might better serve your user. If you’re a fashion store, images might be your priority, while if you’re an electronics store, the product specifications should be more visible instead.
  3.6 Deindex useless pages
  Sometimes, when you have a lot of very similar pages that host thin content with no added value, the only viable solution is to remove those pages completely.
  This can be done in a number of ways. However, the best ones are:
  Removing the content altogether
Using canonical tags to combine them
Using robots.txt & noindex
  However, you’ll have to choose carefully which method you use. Remember, you don’t want to remove those pages with search demand from the search engines!
This may lead you to switch your focus from optimizing individual product pages to optimizing category pages.
  Conclusion
  Thin content is definitely bad for your website. It’s always better to avoid an issue from the beginning than to have to fix it later on. This saves you both time and money.
  However, you’ll have to know about these issues early on, before you even start setting up your website and content marketing strategy. Hopefully, this article helped you have a better understanding on the topic.
  Have you ever faced thin content issues on your websites in your digital marketing journey? How do you identify it? And how did you solve these content issues? Let us know in the comments section below!
The post Thin Content & SEO | How to Avoid a Google Thin Content Penalty appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
codword · 6 years ago
Text
How to get your website ranked on Google First Page? Tips Unveiled
As everyone knows, ranking a website on the first page of Google is essential. The fight is always for the first five spots, as getting ranked after that is little more than a formality.
Business owners hire digital marketing teams to get the best SEO services for their brand websites, but only 10 websites are placed on the first page of Google's search results, and of those, the first 5 receive most of the clicks if their services match the searcher's requirement.
Someone has rightly said, “The best place to hide a dead body is the second page of Google search results.”
This is largely true: around 85% of users never turn to the next page of Google search results, which is why SEO experts bring their experience into action to keep business keywords within the top 5 spots on the first page of the leading search engine.
Though many websites are indexed on Google, very few manage a presence on the first page of search results. Those who fail keep trying hard while ignoring the very issues that keep them off the top of the results.
At times, even the top experts at an SEO company in the USA fail at the assigned task and are not able to rank a website on the first page of Google search results. Some professionals actually type queries themselves to find relevant answers, so they can work on the issues and get the website ranked for its business keywords.
Some of those are as follows:
How to get to the top of Google search results?
How to get your website on Google first page?
How to make a website appear first in Google search?
How to rank my website at top on Google?
You cannot question their caliber, as they have tried their best to get the website ranked on top. The rankings come the day all the on-page limitations are fixed and the off-page SEO work is performed up to the mark.
Learn How SEO helps a website rank on the first page of Google Search Results 
Know about the Google search algorithm
Google's algorithm plays a vital part in deciding where a website ranks. It never remains constant, as the protocols keep changing over time to get spammy sites de-indexed from the search results.
Check your business keyword ranking
Check your business keyword rankings on a regular basis by searching for them on Google. If you find the keywords are not relevant to the service you provide, switch to targeted keywords that fit best in your content and meta tags. If you are looking for a convenient way to track your business keyword rankings, a tool such as serps.com is an easy and reliable way to get ranking results quickly.
Track and measure the right metrics
 It’s quite vital to check the metrics of your website to find everything is running in a proper state and the factors that need improvement. Metrics such as organic traffic and conversions, commercial keywords ranking can be done through the Google webmaster tool as it is reliable and deliver informative results that are easily understandable. 
Make sure your website is mobile-friendly
As everything goes digital these days, your website must be fully responsive and mobile-friendly, so that visitors who reach it on a smartphone can go through the information without any hassle and convert into customers. Also, check page loading times and fix any issues that affect how the page renders on mobile devices.
Diagnose and fix existing issues in website
It’s essential to diagnose the website to check if it is penalized as it can be the main reason behind website not getting ranked on top with business keywords in spite of many efforts. You have to create unique backlinks that are active and relevant as your website niche. The mix of no-follow and do-follow backlinks are recommended as per Google Panda update as the only emphasis on creating only do-follow links will become under the scanner
Consult with an SEO company in USA regarding this issue if you’re not able to do this yourself. It’s recommended to be done by experts as they have the knowledge to fix the issues with perfection.
Perform Keyword Research
Keyword research is essential in the on-page SEO process, as targeted long-tail keywords are what people generally use when they query a specific topic. For example:
“How to book flight tickets for traveling to Las Vegas” 
Prefer low- and medium-competition keywords if you are a newly set up company, and target how-to questions phrased as queries, since Google prefers user-friendly content; the search engine exists to help web users find relevant answers to their queries.
Make on-page SEO effective
Optimizing website titles and descriptions with business keywords, adding schema mark-up, creating sitemaps, and rectifying Google crawl errors through the webmaster tool all make on-page SEO effective.
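Schema mark-up is typically added as a JSON-LD block in the page's head; the organization name and URLs below are placeholders to replace with your own.

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Company",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```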
Draft a user-friendly SEO Content for your website
Content is said to be king in digital marketing these days and is given top priority among the signals Google weighs when assigning a rank to a website. Readability errors will not be tolerated: Google may push the website back to the 7th or 8th results page or even deindex it. Top-quality content with sensible keyword placement has a real chance of being reflected in a first-page Google ranking.
Bring Social networks in usage
Social networks such as Facebook, Instagram and LinkedIn are used on a large scale to promote and advertise websites. Using the proper keywords or hashtags in your posts, along with captivating images, drives significant traffic to your business website, which will eventually help you in the long run.
A closing note
There are various digital marketing companies in the United States that deliver the best SEO services to help you rank your website in the top spots. If you have no idea how to get your website onto Google's first page, it's highly recommended to avail yourself of the services offered by digital marketing experts, as their expertise can help deliver the best results for your business.
philipfloyd · 6 years ago
Text
Thin Content & SEO | How to Avoid a Google Thin Content Penalty
We live in a world of information overload. If 10 years ago it was hard to find content at all, now there’s way too much of it! Which one is good? Which one is bad? We don’t know.
  While this subject is very complex, it’s clear that Google is attempting to solve these content issues in its search results. One of the biggest issues they’ve encountered in the digital marketing world is what they call thin content.
    But what exactly is thin content? Should you worry about it? Can it affect your website’s SEO in a negative way? Well, thin content can get your site manually penalized but it can also sometimes send your website in Google’s omitted results. If you want to avoid these issues, keep reading!
  What Is Thin Content & How Does It Affect SEO?
Is Thin Content Still a Problem in 2019?
How Does Thin Content Affect SEO?
Where Is Thin Content Found Most Often?
How to Identify Thin Content Pages
How to Fix Thin Content Issues & Avoid a Google Penalty
Make sure your site looks legit
Add more content & avoid similar titles
Don’t copy content
Web design, formatting & ads
Video, images, text, audio, etc.
Deindex/remove useless pages
  1. What Is Thin Content & How Does It Affect SEO?
  Thin content is an OnPage SEO issue that has been defined by Google as content with no added value.
  When you’re publishing content on your website and it doesn’t improve the quality of a search results page at least a little bit, you’re publishing thin content.
  For a very dull example, when you search Google for a question such as “What color is the sky?” and there’s an article out there saying “The sky is blue!”, if you publish an article with the same answer you would be guilty of adding no value.
  So does it mean that this article is thin content because there are other articles about thin content out there?
  Well.. no. Why? Because I’m adding value to it. First, I’m adding my own opinion, which is crucial. Then, I’m trying to structure it as logically as possible, address as many important issues as I can and cover gaps which I have identified from other pieces.
  Sometimes, you might not have something new to say, but you might have a better way of saying it. To go back to our example, you could say something like “The sky doesn’t really have a color but is perceived as blue by the human eye because of the way light scatters through the atmosphere.”
  Of course, you would probably have to add at least another 1500 words to that to make it seem like it’s not thin. It’s true. Longer content tends to rank better in Google, with top positions averaging about 2000 words.
  How your content should be to rank
  Sometimes, you might add value through design or maybe even through a faster website. There are multiple ways through which you can add value. We’ll talk about them soon.
  From the Google Webmaster Guidelines page we can extract 4 types of practices which are strictly related to content quality. However, they are not easy to define!
  Automatically generated content: Simple. It’s content created by robots to replace regular content, written by humans. Don’t do it. But… some AI content marketing tools have become so advanced that it’s hard to distinguish between real and automatically generated content. Humans can write poorly too. Don’t expect a cheap freelancer who writes 1000 words for $1 to have good grammar and copy. A robot might be better. But theoretically, that’s against the rules.
Thin affiliate pages: If you’re publishing affiliate pages which don’t include reviews or opinions, you’re not providing any new value to the users compared to what the actual store is already providing on their sales page.
Scraped or copied content: The catch here is to have original content. If you don’t have original content, you shouldn’t be posting it to claim it’s yours. However, even when you don’t claim it’s yours, you can’t expect Google to rank it better than the original source. Maybe there can be a reason (better design, faster website) but, generally, nobody would say it’s fair. Scraping is a no no and Google really hates it.
Doorway pages: Doorway pages are pages created to target and rank for a variety of very similar queries. While this is bad in Google’s eyes, the search giant doesn’t provide an alternative to doorway pages. If you have to target 5-10 similar queries (let’s say if you’re doing local SEO for a client), you might pull something off with one page, but if you have to target thousands of similar queries, you won’t be able to do it. A national car rental service, for example, will always have pages which could be considered doorways.
  If you want, you can listen to Matt Cutts’ explanation from this video.
youtube
    As you can see, it all revolves around value. The content that you publish must have some value to the user. If it’s just there because you want traffic, then you’re doing it wrong.
  But value can sometimes be hard to define. For some, their content might seem as the most valuable, while for others it might seem useless. For example, one might write “Plumbing services New York, $35 / hour, Phone number”. The other might write “The entire history of plumbing, How to do it yourself, Plumbing services New York, $35 / hour, Phone number.”
  Which one is more relevant? Which one provides more value? It really depends on the user’s intent. If the user just wants a plumber, they don’t want to hear about all the history. They just want a phone number and a quick, good service.
  However, what’s important to understand is that there is always a way to add value.
  In the end, it’s the search engine that decides, but there are some guidelines you can follow to make sure Google sees your content as valuable. Keep reading and you’ll find out all about them. But first, let’s better understand why thin content is still an issue and how it actually affects search engine optimization.
  1.1 Is Thin Content Still a Problem in 2019?
  The thin content purge started on February 23, 2011 with the first Panda Update. At first, Google introduced the thin content penalty because many people were generating content automatically or were creating thousands of irrelevant pages.
  The series of further updates were successful and many websites with low quality content got penalized or deranked. This pushed site owners to write better content.
  Unfortunately, today this mostly translates to longer content. The more you write, the more value you can provide, right? We know it’s not necessarily the case, but as I’ve said, longer content does tend to rank better in Google. Be it because the content makes its way up there or because the search engine is biased towards it… it’s hard to tell.
  But there’s also evidence that long form content gets more shares on social media. This can result in more backlinks, which translates to better rankings. So it’s not directly the fact that the content is long, but rather an indirect factor related to it.
  It’s kind of ironic, as Google sometimes uses its answer boxes to give a very ‘thin’ answer to questions that might require more context to be well understood.
  However, in 2019 it’s common SEO knowledge that content must be of high quality. The issue today shifts to the overload of content that is constantly being published. Everything is, at least to some extent, qualitative.
  But it’s hard to get all the information from everywhere and you don’t always know which source to rely on or trust. That’s why content curation has been doing so well lately.
  This manifests itself in other areas, especially where there’s a very tough competition, such as eCommerce.
  1.2 How Does Thin Content Affect SEO?
  Google wants to serve its users the best possible content it can. If Google doesn’t do that, then its users won’t return to Google and could classify it as a poor quality service. And that makes the search engine unhappy.
  Google generally applies a manual action penalty to websites it considers to contain thin content. You will see it in the Google Search Console (former Google Webmaster Tools) and it looks like this:
  However, your site can still be affected by thin content even if you don’t get a warning from Google in your Search Console account. That’s because you’re diluting your site’s value and burning through your crawl budget.
  The problem that search engines have is that they constantly have to crawl a lot of pages. The more pages you give it to crawl, the more work it has to do.
  If the pages the search engine crawls are not useful for the users, then Google will have a problem with wasting its time on your content.
  1.3 Where Is Thin Content Found Most Often?
  Thin content is found most of the time on bigger websites. For the sake of helping people that really need help, let’s exclude spammy affiliate websites and automated blogs from this list.
  Big websites, like eCommerce stores, often have a hard time coming up with original, high quality content for all their pages, especially for thousands of product pages.
    In the example above, you can see that although the Product Details section under the image is expanded, there’s no content there. This means that users don’t have any details at all about the dress. All they know is that it’s a dress, it’s black and it costs about $20.
  This doesn’t look too bad when you’re looking as a human at a single page, but when you’re a search engine and take a look at thousands and thousands of pages just like this one, then you begin to see the issue.
  The solution here is to add some copy. Think of what users want to know about your product. Make sure you add the details about everything they might want to know and make them easily accessible!
  Sometimes, thin content makes its way into eCommerce sites unnoticed. For example, you might have a category page which hosts a single product. Compared to all your other categories or competitor websites,that can be seen as thin content.
  2. How to Identify Thin Content Pages
  If we are referring merely to its size, then thin content can be easily identified using the cognitiveSEO Tool’s Site Audit.
  Did you know?
Identifying thin content is actually really easy with a tool like cognitiveSEO Site Audit. The tool has a Thin Content section where you can easily find the pages with issues.
It’s as simple as that! Once you have your list, you can export it and start adding some content to those pages. This will improve their chances to make it to the top of the search results.
  However, you also want to take a look at the duplicate content section in the Site Audit tool. This can also lead to a lot of indexation & content issues.
    Extremely similar pages can be “combined” using canonical tags. Sometimes it can be a good idea to remove them completely from the search engine results.
  3. How to Fix Thin Content Issues & Avoid a Google Penalty
  Sometimes, you can fix thin content issues easily, especially if you get a manual penalty warning. At least if your website isn’t huge. If you have thousands of pages, it might take a while till you can fix them.
  Here’ s a happy ending case from one of Doug Cunnington’s students:
youtube
    However, the “penalty” can also come from the algorithm and you won’t even know it’s there because there is no warning. It’s not actually a penalty, it’s just the fact that Google won’t rank your pages because of their poor quality.
  When that’s the case, it might not be as easy to get things fixed as in the video above.
  In order to avoid getting these penalties, here’s a few things that you should consider when you write content.
  3.1 Make sure your site looks legit
  First of all, if your website looks shady, then you have a higher chance of getting a manual penalty on your website. If someone from Google reviews your website and decides it looks spammy at a first glance, they will be more likely to consider penalizing it.
  To avoid this, make sure you:
  Use an original template and customize it a little bit
Have a logo or some sort of original branding
Provide an about page and contact details
  3.2 Add more content & avoid very similar titles
  The best way to show Google that your pages are worth taking a look at is to not leave them empty. In 2019, I hope we all know that for good OnPage SEO we need to add a little bit more content.
  Your pages should have at least 300 words of copy. Notice how I say copy, not words. If you’re there to sell, write copy. Even on an eCommerce product page.
  If you’re not sure what to write about, you can always use the CognitiveSEO Keyword Tool & Content Assistant. It will give you ideas on what you should write on your pages to make them relevant for the query you want them to rank on.
  Automatically generated titles can also quickly trigger Google’s alarms. If you review multiple products from the same brand and your titles are like this:
  Nike Air Max 520 Review
Nike Air Max 620 Review
Nike Air Max 720 Review
  then you can see how it might be an issue. Do those articles provide any value or are they all the same except for one digit?
  It’s important to have the keywords in your title, but you can also try to add some diversity to them. It’s not always very hard to do. A good example could be:
  Nike Air Max 520 Review | Best bang for the buck
Nike Air Max 620 | A Comprehensive Review Regarding Comfort
Nike Air Max 720 | Review After 2 Weeks of Wearing Them at The Gym
  But Adrian, I have an eCommerce site with over 2000 products, I can’t write original titles for all of them!
  That’s why I said that content isn’t the only way you can provide value with. If you can’t change the titles and content, improve some other areas.
  However, the truth is that there’s someone out there who does optimize and show love to all their titles, even if there are 2000 of them. So why shouldn’t they be rewarded for it?
  Usually, very similar titles are a result of content duplication issues. If you have a product that comes in 100 different colors, you don’t necessarily need to have 100 different pages with 100 unique titles and copy. You can just make them 1 single page where users can select their color without having to go to another URL.
  Combining pages can also be done via canonical tags, although it’s recommended to only keep this for duplicate content. Pages with different colors can count as duplicate content, as only one word is different, so the similarity is 99.9%.
  Make sure that the pages that get canonicalized don’t provide organic search traffic. For example, if people search for “blue dress for ladies” then it’s a good idea to have a separate page that can directly rank for that query instead of canonicalizing it to the black version.
  A proper faceted navigation can help you solve all these SEO issues.
  3.3 Don’t copy content
  Copying content from other websites will definitely make your site look bad in Google’s eyes.
  Again, this happens mostly on eCommerce websites, where editors get the descriptions directly from the producer’s official website. Many times they also duplicate pages in order to save time and just change a couple of words.
  On the long run, this will definitely get you into duplicate content issues, which can become very hard to fix once they’re out of control. It will also tell Google that your site endorses competitors. By using their copy, you’re considering it valuable, right?
  3.4 Web design, formatting & ads
  Sometimes, you can identify gaps in web design or formatting. That’s not easy to do, as you’ll have to manually take a look at your competitor’s websites. Here are some questions you should ask yourself:
  Are competitors presenting their information in an unpleasant manner? Do they have too many pop-ups, too many ads or very nasty designs?
  Then that’s obviously where you can make a difference. This doesn’t give you the right not to have an original copy, but it might have a greater impact.
  Source: premiumcoding.com
  3.5 Video, images, text, audio, etc.
  Big, successful eCommerce businesses which have an entire community supporting them and backing them up have used this technique for a long time: video content.
  This might work better in some niches, such as tech. In Romania, cel.ro has a very bad reputation with delivery and quality, yet it still has a decent amount of market share due to its strong video content marketing strategy.
  If you want to improve the value of your page, make sure you add images, videos or whatever you think might better serve your user. If you’re a fashion store, images might be your priority, while if you’re an electronics store, the product specifications should be more visible instead.
  3.6 Deindex useless pages
  Sometimes, when you have a lot of very similar pages that host thin content with no added value, the only viable solution is to remove those pages completely.
  This can be done in a number of ways. However, the best ones are:
  Removing the content altogether
Using canonical tags to combine them
Using robots.txt & noindex
  However, you’ll have to choose carefully which method you use. Remember, you don’t want to remove those pages with search demand from the search engines!
  Source: Moz.com
  This can determine you to switch the focus from optimizing individual product pages to optimizing category pages.
  Conclusion
  Thin content is definitely bad for your website. It’s always better to avoid an issue from the beginning than to have to fix it later on. This saves you both time and money.
  However, you’ll have to know about these issues early on, before you even start setting up your website and content marketing strategy. Hopefully, this article helped you have a better understanding on the topic.
  Have you ever faced thin content issues on your websites in your digital marketing journey? How do you identify it? And how did you solve these content issues? Let us know in the comments section below!
The post Thin Content & SEO | How to Avoid a Google Thin Content Penalty appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
from Marketing https://cognitiveseo.com/blog/22582/thin-content-google-penalty-seo/ via http://www.rssmix.com/
0 notes
wjwilliams29 · 6 years ago
Text
Thin Content & SEO | How to Avoid a Google Thin Content Penalty
We live in a world of information overload. If 10 years ago it was hard to find content at all, now there’s way too much of it! Which one is good? Which one is bad? We don’t know.
  While this subject is very complex, it’s clear that Google is attempting to solve these content issues in its search results. One of the biggest issues they’ve encountered in the digital marketing world is what they call thin content.
    But what exactly is thin content? Should you worry about it? Can it affect your website’s SEO in a negative way? Well, thin content can get your site manually penalized but it can also sometimes send your website in Google’s omitted results. If you want to avoid these issues, keep reading!
  What Is Thin Content & How Does It Affect SEO?
Is Thin Content Still a Problem in 2019?
How Does Thin Content Affect SEO?
Where Is Thin Content Found Most Often?
How to Identify Thin Content Pages
How to Fix Thin Content Issues & Avoid a Google Penalty
Make sure your site looks legit
Add more content & avoid similar titles
Don’t copy content
Web design, formatting & ads
Video, images, text, audio, etc.
Deindex/remove useless pages
  1. What Is Thin Content & How Does It Affect SEO?
  Thin content is an OnPage SEO issue that has been defined by Google as content with no added value.
  When you’re publishing content on your website and it doesn’t improve the quality of a search results page at least a little bit, you’re publishing thin content.
  For a very dull example, when you search Google for a question such as “What color is the sky?” and there’s an article out there saying “The sky is blue!”, if you publish an article with the same answer you would be guilty of adding no value.
  So does it mean that this article is thin content because there are other articles about thin content out there?
  Well.. no. Why? Because I’m adding value to it. First, I’m adding my own opinion, which is crucial. Then, I’m trying to structure it as logically as possible, address as many important issues as I can and cover gaps which I have identified from other pieces.
  Sometimes, you might not have something new to say, but you might have a better way of saying it. To go back to our example, you could say something like “The sky doesn’t really have a color but is perceived as blue by the human eye because of the way light scatters through the atmosphere.”
  Of course, you would probably have to add at least another 1500 words to that to make it seem like it’s not thin. It’s true. Longer content tends to rank better in Google, with top positions averaging about 2000 words.
  How your content should be to rank
  Sometimes, you might add value through design or maybe even through a faster website. There are multiple ways through which you can add value. We’ll talk about them soon.
  From the Google Webmaster Guidelines page we can extract 4 types of practices which are strictly related to content quality. However, they are not easy to define!
  Automatically generated content: Simple. It’s content created by robots to replace regular content, written by humans. Don’t do it. But… some AI content marketing tools have become so advanced that it’s hard to distinguish between real and automatically generated content. Humans can write poorly too. Don’t expect a cheap freelancer who writes 1000 words for $1 to have good grammar and copy. A robot might be better. But theoretically, that’s against the rules.
Thin affiliate pages: If you’re publishing affiliate pages which don’t include reviews or opinions, you’re not providing any new value to the users compared to what the actual store is already providing on their sales page.
Scraped or copied content: The catch here is to have original content. If you don’t have original content, you shouldn’t be posting it to claim it’s yours. However, even when you don’t claim it’s yours, you can’t expect Google to rank it better than the original source. Maybe there can be a reason (better design, faster website) but, generally, nobody would say it’s fair. Scraping is a no no and Google really hates it.
Doorway pages: Doorway pages are pages created to target and rank for a variety of very similar queries. While this is bad in Google’s eyes, the search giant doesn’t provide an alternative to doorway pages. If you have to target 5-10 similar queries (let’s say if you’re doing local SEO for a client), you might pull something off with one page, but if you have to target thousands of similar queries, you won’t be able to do it. A national car rental service, for example, will always have pages which could be considered doorways.
  If you want, you can listen to Matt Cutts’ explanation from this video.
youtube
    As you can see, it all revolves around value. The content that you publish must have some value to the user. If it’s just there because you want traffic, then you’re doing it wrong.
  But value can sometimes be hard to define. For some, their content might seem as the most valuable, while for others it might seem useless. For example, one might write “Plumbing services New York, $35 / hour, Phone number”. The other might write “The entire history of plumbing, How to do it yourself, Plumbing services New York, $35 / hour, Phone number.”
  Which one is more relevant? Which one provides more value? It really depends on the user’s intent. If the user just wants a plumber, they don’t want to hear about all the history. They just want a phone number and a quick, good service.
  However, what’s important to understand is that there is always a way to add value.
  In the end, it’s the search engine that decides, but there are some guidelines you can follow to make sure Google sees your content as valuable. Keep reading and you’ll find out all about them. But first, let’s better understand why thin content is still an issue and how it actually affects search engine optimization.
  1.1 Is Thin Content Still a Problem in 2019?
  The thin content purge started on February 23, 2011 with the first Panda Update. At first, Google introduced the thin content penalty because many people were generating content automatically or were creating thousands of irrelevant pages.
  The series of further updates were successful and many websites with low quality content got penalized or deranked. This pushed site owners to write better content.
  Unfortunately, today this mostly translates to longer content. The more you write, the more value you can provide, right? We know it’s not necessarily the case, but as I’ve said, longer content does tend to rank better in Google. Be it because the content makes its way up there or because the search engine is biased towards it… it’s hard to tell.
  But there’s also evidence that long form content gets more shares on social media. This can result in more backlinks, which translates to better rankings. So it’s not directly the fact that the content is long, but rather an indirect factor related to it.
  It’s kind of ironic, as Google sometimes uses its answer boxes to give a very ‘thin’ answer to questions that might require more context to be well understood.
  However, in 2019 it’s common SEO knowledge that content must be of high quality. The issue today shifts to the overload of content that is constantly being published. Everything is, at least to some extent, qualitative.
  But it’s hard to get all the information from everywhere and you don’t always know which source to rely on or trust. That’s why content curation has been doing so well lately.
  This manifests itself in other areas, especially where there’s a very tough competition, such as eCommerce.
  1.2 How Does Thin Content Affect SEO?
  Google wants to serve its users the best possible content it can. If Google doesn’t do that, then its users won’t return to Google and could classify it as a poor quality service. And that makes the search engine unhappy.
  Google generally applies a manual action penalty to websites it considers to contain thin content. You will see it in the Google Search Console (former Google Webmaster Tools) and it looks like this:
  However, your site can still be affected by thin content even if you don’t get a warning from Google in your Search Console account. That’s because you’re diluting your site’s value and burning through your crawl budget.
  The problem that search engines have is that they constantly have to crawl a lot of pages. The more pages you give it to crawl, the more work it has to do.
  If the pages the search engine crawls are not useful for the users, then Google will have a problem with wasting its time on your content.
  1.3 Where Is Thin Content Found Most Often?
  Thin content is found most of the time on bigger websites. For the sake of helping people that really need help, let’s exclude spammy affiliate websites and automated blogs from this list.
  Big websites, like eCommerce stores, often have a hard time coming up with original, high quality content for all their pages, especially for thousands of product pages.
  Take a typical example: although the Product Details section under the image is expanded, there’s no content there. This means that users don’t have any details at all about the dress. All they know is that it’s a dress, it’s black and it costs about $20.
  This doesn’t look too bad when a human looks at a single page, but when a search engine looks at thousands and thousands of pages just like this one, the issue becomes obvious.
  The solution here is to add some copy. Think of what users want to know about your product. Make sure you add the details about everything they might want to know and make them easily accessible!
  Sometimes, thin content makes its way into eCommerce sites unnoticed. For example, you might have a category page which hosts a single product. Compared to all your other categories or competitor websites, that can be seen as thin content.
  2. How to Identify Thin Content Pages
  If we are referring merely to content length, then thin content can be easily identified using the cognitiveSEO Tool’s Site Audit.
  Did you know?
Identifying thin content is actually really easy with a tool like cognitiveSEO Site Audit. The tool has a Thin Content section where you can easily find the pages with issues.
It’s as simple as that! Once you have your list, you can export it and start adding some content to those pages. This will improve their chances to make it to the top of the search results.
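If you don’t have a dedicated audit tool at hand, a rough do-it-yourself check is to pull the visible text of each page and flag anything below a word-count threshold. Here’s a minimal sketch in Python; it assumes the requests and beautifulsoup4 packages, and the URLs and the 300-word cut-off are placeholders you’d swap for your own:

```python
import requests
from bs4 import BeautifulSoup

THRESHOLD = 300  # words of copy; adjust to what makes sense for your site

urls = [
    "https://example.com/product/black-dress",
    "https://example.com/category/dresses",
]

for url in urls:
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Drop scripts, styles and navigation so only readable copy is counted
    for tag in soup(["script", "style", "nav", "header", "footer"]):
        tag.decompose()
    word_count = len(soup.get_text(separator=" ").split())
    if word_count < THRESHOLD:
        print(f"Thin candidate ({word_count} words): {url}")
```

It won’t replace a full crawl, but it surfaces the worst offenders quickly.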
  However, you also want to take a look at the duplicate content section in the Site Audit tool. This can also lead to a lot of indexation & content issues.
    Extremely similar pages can be “combined” using canonical tags. Sometimes it can be a good idea to remove them completely from the search engine results.
  3. How to Fix Thin Content Issues & Avoid a Google Penalty
  Sometimes, you can fix thin content issues easily, especially if you get a manual penalty warning. At least if your website isn’t huge. If you have thousands of pages, it might take a while until you can fix them all.
  Here’s a happy-ending case from one of Doug Cunnington’s students:
(Embedded YouTube video)
    However, the “penalty” can also come from the algorithm and you won’t even know it’s there because there is no warning. It’s not actually a penalty, it’s just the fact that Google won’t rank your pages because of their poor quality.
  When that’s the case, it might not be as easy to get things fixed as in the video above.
  In order to avoid getting these penalties, here are a few things that you should consider when you write content.
  3.1 Make sure your site looks legit
  First of all, if your website looks shady, then you have a higher chance of getting a manual penalty. If someone from Google reviews your website and decides it looks spammy at first glance, they will be more likely to consider penalizing it.
  To avoid this, make sure you:
  Use an original template and customize it a little bit
Have a logo or some sort of original branding
Provide an about page and contact details
  3.2 Add more content & avoid very similar titles
  The best way to show Google that your pages are worth taking a look at is to not leave them empty. In 2019, I hope we all know that for good OnPage SEO we need to add a little bit more content.
  Your pages should have at least 300 words of copy. Notice how I say copy, not words. If you’re there to sell, write copy. Even on an eCommerce product page.
  If you’re not sure what to write about, you can always use the CognitiveSEO Keyword Tool & Content Assistant. It will give you ideas on what you should write on your pages to make them relevant for the query you want them to rank on.
  Automatically generated titles can also quickly trigger Google’s alarms. If you review multiple products from the same brand and your titles are like this:
  Nike Air Max 520 Review
Nike Air Max 620 Review
Nike Air Max 720 Review
  then you can see how it might be an issue. Do those articles provide any value or are they all the same except for one digit?
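  If you want to check your own catalog for this pattern, a quick script can compare titles pairwise and flag the ones that are nearly identical. A small sketch using only Python’s standard library, with the hypothetical titles from above:

```python
from difflib import SequenceMatcher
from itertools import combinations

titles = [
    "Nike Air Max 520 Review",
    "Nike Air Max 620 Review",
    "Nike Air Max 720 Review",
    "How to Choose Running Shoes for Flat Feet",
]

# Flag any pair of titles that is more than 90% identical
for a, b in combinations(titles, 2):
    ratio = SequenceMatcher(None, a, b).ratio()
    if ratio > 0.9:
        print(f"{ratio:.0%} similar: '{a}' vs '{b}'")
```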
  It’s important to have the keywords in your title, but you can also try to add some diversity to them. It’s not always very hard to do. A good example could be:
  Nike Air Max 520 Review | Best bang for the buck
Nike Air Max 620 | A Comprehensive Review Regarding Comfort
Nike Air Max 720 | Review After 2 Weeks of Wearing Them at The Gym
  But Adrian, I have an eCommerce site with over 2000 products, I can’t write original titles for all of them!
  That’s why I said that content isn’t the only way you can provide value. If you can’t change the titles and content, improve some other areas.
  However, the truth is that there’s someone out there who does optimize and show love to all their titles, even if there are 2000 of them. So why shouldn’t they be rewarded for it?
  Usually, very similar titles are a result of content duplication issues. If you have a product that comes in 100 different colors, you don’t necessarily need to have 100 different pages with 100 unique titles and copy. You can just make them 1 single page where users can select their color without having to go to another URL.
  Combining pages can also be done via canonical tags, although it’s recommended to only keep this for duplicate content. Pages with different colors can count as duplicate content, as only one word is different, so the similarity is 99.9%.
  Make sure that the pages you canonicalize away aren’t already bringing in organic search traffic. For example, if people search for “blue dress for ladies” then it’s a good idea to have a separate page that can directly rank for that query instead of canonicalizing it to the black version.
  A proper faceted navigation can help you solve all these SEO issues.
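  If you do consolidate color variants with canonical tags, it’s worth spot-checking that every variant actually declares the canonical you intend. A rough sketch, assuming the requests and beautifulsoup4 packages; the URLs are invented for illustration:

```python
import requests
from bs4 import BeautifulSoup

expected_canonical = "https://example.com/dress/summer-dress"
variant_urls = [
    "https://example.com/dress/summer-dress?color=black",
    "https://example.com/dress/summer-dress?color=blue",
]

for url in variant_urls:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    # Look for the rel="canonical" declaration in the page head
    link = soup.find("link", attrs={"rel": "canonical"})
    canonical = link.get("href") if link else None
    status = "OK" if canonical == expected_canonical else "CHECK"
    print(f"{status}: {url} -> canonical: {canonical}")
```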
  3.3 Don’t copy content
  Copying content from other websites will definitely make your site look bad in Google’s eyes.
  Again, this happens mostly on eCommerce websites, where editors get the descriptions directly from the producer’s official website. Many times they also duplicate pages in order to save time and just change a couple of words.
  In the long run, this will definitely get you into duplicate content issues, which can become very hard to fix once they’re out of control. It also tells Google that your site endorses competitors. By using their copy, you’re saying you consider it valuable, right?
  3.4 Web design, formatting & ads
  Sometimes, you can identify gaps in web design or formatting. That’s not easy to do, as you’ll have to manually take a look at your competitor’s websites. Here are some questions you should ask yourself:
  Are competitors presenting their information in an unpleasant manner? Do they have too many pop-ups, too many ads or very nasty designs?
  Then that’s obviously where you can make a difference. This doesn’t excuse you from writing original copy, but it might have an even greater impact.
  Source: premiumcoding.com
  3.5 Video, images, text, audio, etc.
  Big, successful eCommerce businesses which have an entire community supporting them and backing them up have used this technique for a long time: video content.
  This might work better in some niches, such as tech. In Romania, cel.ro has a very bad reputation with delivery and quality, yet it still has a decent amount of market share due to its strong video content marketing strategy.
  If you want to improve the value of your page, make sure you add images, videos or whatever you think might better serve your user. If you’re a fashion store, images might be your priority, while if you’re an electronics store, the product specifications should be more visible instead.
  3.6 Deindex useless pages
  Sometimes, when you have a lot of very similar pages that host thin content with no added value, the only viable solution is to remove those pages completely.
  This can be done in a number of ways. However, the best ones are:
  Removing the content altogether
Using canonical tags to combine them
Using robots.txt & noindex
  However, you’ll have to choose carefully which method you use. Remember, you don’t want to remove pages that have search demand from the search engines! A quick check like the sketch below can help you confirm that the pages you want to keep aren’t accidentally sending a noindex signal.
  Source: Moz.com
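  Before you deindex anything, you can run a quick sanity check that the pages you do want to keep aren’t accidentally carrying a noindex directive, either in a meta robots tag or an X-Robots-Tag header. A minimal sketch, with placeholder URLs:

```python
import requests
from bs4 import BeautifulSoup

keep_urls = [
    "https://example.com/dresses/blue-dress-for-ladies",
    "https://example.com/category/dresses",
]

for url in keep_urls:
    response = requests.get(url, timeout=10)
    header_directive = response.headers.get("X-Robots-Tag", "")
    soup = BeautifulSoup(response.text, "html.parser")
    meta = soup.find("meta", attrs={"name": "robots"})
    meta_directive = meta.get("content", "") if meta else ""
    if "noindex" in header_directive.lower() or "noindex" in meta_directive.lower():
        print(f"WARNING, noindex found on a page you want to keep: {url}")
    else:
        print(f"OK: {url}")
```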
  This might prompt you to shift your focus from optimizing individual product pages to optimizing category pages.
  Conclusion
  Thin content is definitely bad for your website. It’s always better to avoid an issue from the beginning than to have to fix it later on. This saves you both time and money.
  However, you’ll have to know about these issues early on, before you even start setting up your website and content marketing strategy. Hopefully, this article helped you have a better understanding on the topic.
  Have you ever faced thin content issues on your websites in your digital marketing journey? How do you identify it? And how did you solve these content issues? Let us know in the comments section below!
The post Thin Content & SEO | How to Avoid a Google Thin Content Penalty appeared first on SEO Blog | cognitiveSEO Blog on SEO Tactics & Strategies.
0 notes
dorothydelgadillo · 6 years ago
Text
A Google Bug May Have De-Indexed Some Of Your Site Pages (And Here's What You Need To Do About It)
On Thursday, April 4th, Google identified an issue that was de-indexing web pages from search results. The next day it claimed the issue was fully resolved and order could be restored to the universe.
However, Google is now saying that the issue is not yet fully resolved, and that relisting all of the URLs taken down last Friday is taking longer than expected.
If your web pages were deindexed, that means people can’t find you on a search engine results page (SERP). The clicks coming from these SERPs, also known as organic traffic, are the primary driver of eyeballs on most websites. So when any of your webpages are removed from the SERPs, you are losing potential traffic, leads, and ultimately sales.
What We Do (And Don’t) Know
Nothing gets past Webmasters and SEOs for very long once they start noticing a dip in traffic. Their investigations are what shined the light on Google's de-indexing of URLs and web pages.
Google Senior Webmaster Trends Analyst John Mueller weighed in on Twitter when the issue was first discovered.
Google is usually very cryptic with everything that happens within its walls and this situation is no different; so we do not know exactly how much of the Google index was impacted by this, which individual sites were affected, or the amount of traffic you may have lost because of it.
But they are working on it!
Where Are We Now?
You can put down your pitchforks and torches for now because it looks like Google is making progress on resolving this issue.
How I imagine Google Senior Webmaster Trends Analyst John Mueller’s Twitter feed when something goes wrong:
(Animated GIF via GIPHY)
What To Do About It
Thanks to our resident technical SEO expert, Franco Valentino, for putting together a short process to identify whether your site was affected, and if so, how to go about re-indexing your pages.
Here's the process to follow (the short helper script after the list can speed things up if you have a lot of pages to check):
Determine whether your website home page has been de-indexed by going to Google and typing "site:yourdomain.com".  If the home page isn't the first entry, it's de-indexed.
Head over to Google Search Console and enter the full path to your website domain like: "https://www.impactbnd.com" in the address bar at the top of the screen. (This assumes you're using the new version of Google Search Console)
If your home page was de-indexed, a modal window pops up, with small lettering on the bottom right that reads "request indexing"
Google Search Console will scroll through a few 'pending index - submitting' screens, and the site should reappear in about a minute.
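If you have more than a handful of important pages, checking them one by one gets tedious. The sketch below doesn’t query Google for you (automated scraping of search results is against Google’s terms); it just builds the site: queries so you can paste them into a browser, and confirms each page is still reachable. The URLs are placeholders:

```python
import urllib.parse
import requests

# Hypothetical list of the pages that matter most to your business
important_urls = [
    "https://www.example.com/",
    "https://www.example.com/pricing",
    "https://www.example.com/blog/top-post",
]

for url in important_urls:
    # Build a site: query you can paste into Google manually
    query = urllib.parse.quote(f"site:{url}")
    print(f"Check manually: https://www.google.com/search?q={query}")

    # Meanwhile, confirm the page itself is reachable and returns 200
    status = requests.get(url, timeout=10).status_code
    if status != 200:
        print(f"  -> page returned HTTP {status}, fix this before worrying about the index")
```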
Now, I know not all marketing teams have a dedicated Franco to keep them sharp when Google makes mistakes.
But I do hope this helps shine a light on how important it is to have someone keeping an eye on your SEO, whether that be someone internal to your team or at an agency with which you are working, so that these issues don’t happen and go unresolved for long periods of time.
from Web Developers World https://www.impactbnd.com/blog/google-bug-may-have-de-indexed-some-of-your-site-pages-and-what-to-do-about-it
0 notes
getcustomersondemand · 7 years ago
Text
Why Won’t My Website Rank on Page 1 of Google?
When prospects initially contact us regarding our SEO services, it is not uncommon for us to hear them say something like “the last company we worked with actually made our site drop in the rankings,” or “the last company we worked with did not get our site to rank any better.” While there are many reasons that either of these things may have happened, the bottom line is that you did not see any increase in revenue, leads, prospects, etc. as a result of the efforts, or, worse, you may have seen a decrease in those numbers. This article is going to explain why that may have happened and what you can do to get back on track.
There are many reasons why a site may not increase in rankings, or worse, may decrease in rankings while search engine optimization is being implemented. Those reasons could be a bad SEO strategy, a change in Google’s algorithm, or something as simple as the fact that the product or service you offer is not one that Google wants to see at the top of page 1. At the time of this article, this is very common in what is referred to as the “Your Money or Your Life” (YMYL) niche. If your product or service has to do with money or health and you are not a very well-established, well-known brand in your market, you will have a very difficult, if not impossible, time getting your site to rank at the top of Google’s search results.
Since there is not much you can do if Google does not want your product or service to rank at the top of their page based on the industry you are in, we are going to tackle the other two reasons you may be having an issue with your site rankings. We’re going to start by discussing Google’s algorithm because this is very closely related to “bad SEO.”
However, before I get started, you may want to run a complete website audit to have handy and reference throughout this article and during your research.
If you follow SEO at all or have been trying to implement it on a website that you own you have probably seen, heard, or read about many of the Google updates. The fact is Google is constantly updating their algorithms on a regular basis, and you only hear about the ones that have made a significant impact on many sites.  So if you had a site that was once ranking well but suddenly took a significant dive in the search engine results, there is a high probability that you got tripped up in one of Google’s algorithmic filters. This could be a new algorithmic filter Google has put in place or a change to a current algorithmic filter.
Google is constantly updating their algorithms in an attempt to give users the best experience possible. For example, Google cannot have a user come to their search engine and search for bananas and have results for apples come up.  Also when results for bananas display Google wants to make sure it is the best results for bananas that are possible based on the user’s intent. For this to happen, Google must put algorithms in place to try and prevent manipulation of the search results. This is where your skill as an SEO or the skill of the SEO that you hire comes into play.
It’s important for you to understand that search engine optimization of any kind is nothing more than manipulation of search engine results, and manipulation in any fashion is a violation of Google’s terms of service. The methods you use, how you implement those methods, and the degree to which you do it are what separate skilled SEOs from novices. It’s the difference between getting your site to rank at the top of page 1 and having it stuck where it is, or even drop in the results. Worse yet, an awful SEO strategy can get your website deindexed from Google’s search results, meaning that your site is no longer even eligible to show up in Google anywhere. It’s the difference between making a lot of money and struggling or going out of business.
If you feel you are the victim of a bad SEO strategy or have been tripped up in one of Google’s algorithmic filters, there are some steps you can take to try and get back on track. The first thing you need to do is make sure that your site has not been deindexed in Google. This is very simple. All you need to do is go to the Google search bar and type in the following: site:yourwebsite.com. After typing this in the search bar, hit enter. Your website should show up as the number one result in the search results. If you do not see it there or within the top three, there is a very high likelihood that your site has been deindexed. This is the highest penalty that Google serves, and you have to contact them through Google Search Console (formerly Google Webmaster Tools) to have your site reconsidered for indexation.
Assuming that your site is not deindexed, the next thing you should do is go into your Google Search Console and see if you have any messages regarding actions that Google took against your site. If your site is not connected to Google Search Console, you need to do that first. If you find messages in there, or what are referred to as manual actions, these will tell you any actions that Google took against your site and what you can do to try and fix the problem. The most common manual action seen inside Google Search Console is the “unnatural links penalty.” This means that Google has detected what they believe to be unnatural linking (i.e., SEO) to your site. If this is the case, you need to start by disavowing some of the incoming links. Once you do that, submit your site for reconsideration. If Google does not accept it, they will usually give you a “hint” on what links to disavow next.
Now just because you do not see any manual actions inside your Google Search Console does not mean that actions have not been taken against your site. It simply means that a human did not manually look at your site, but it is still very possible that actions have been taken based on an algorithmic filter. Algorithmic filter penalties are more difficult to figure out. Identifying which algorithmic penalty your site has been hit with often requires changing and testing one component at a time. The most common components to test are site content (too much or too little) and links (too many, wrong anchor text, wrong type of sites).
Now you have checked for manual actions and know that you either had none or that you addressed them. You have tested for algorithmic penalties and either addressed those or excluded them from being an issue affecting the ranking of your site. You now need to consider a bad SEO strategy from the standpoint of either not enough is being done to create movement in your site, or the wrong strategies are being implemented.
You should think of SEO as a car race. To win a car race, you need two things: a properly built car and the fuel to make it run. In SEO, your website is your car. You need to make sure that your car is built properly, because if it is not, it doesn’t matter what kind of fuel you put in or how much; it is not going to run correctly. Things that you need to make your car (website) run properly are referred to as on-page factors. The fuel that you need to put in your car to make it run after it’s built properly is referred to as off-page factors. One factor without the other will simply leave you stuck at the start line. Here is a good post explaining on-page and off-page SEO factors in more detail.
Once you have made sure that your site has no penalties, whether manual or algorithmic, and you have optimized your website for all on-page factors and have begun implementing off-page factors, unfortunately, all you can do is wait and monitor the results. SEO requires patience. While many variables affect the time it takes to see any results, generally speaking, any changes you make related to on-page factors should be recognized within about two weeks on average. Off-page factors such as linking might take anywhere between four and six weeks to be recognized and as much as 90 days for their complete impact to be felt.
Proper SEO requires a high degree of skill and experience. If you have any doubts whatsoever regarding your ability in this area, and you rely on your website and online traffic to generate revenue, I recommend you consult a professional. When you work with the right partner, the results will speak for themselves, and you will wish you had done it sooner. If you are interested in working with us, you can see if you qualify and request a free video analysis.
The post Why Won’t My Website Rank on Page 1 of Google? appeared first on getcustomersondemand.com.
0 notes
greengabbards · 7 years ago
Text
Ultimate Reputation Management Guide Pt 6
Ultimate Reputation Management Guide Pt 6
In the last lesson, we talked about how you can create a number of assets, including WordPress installs, Facebook pages, Twitter profiles, Blogger blogs, and a number of other websites and social media accounts you might need to build a strong reputation online. In this lesson, I will teach you how to develop your asset linking strategy, which will help your SEO tremendously by helping you interlink your websites to increase your Google PageRank.
As we touched on in a previous lesson, it’s really important to get your linking strategy correct. Google is very good at ferreting out people who try to use linking strategies to game their system. One of the most important factors Google uses to determine where your page should rank is the number of backlinks you have to your site. This means that you have to be cautious when creating backlinks through social media profiles or any websites you own because there is a right way to do it and a wrong way. If you do it the wrong way, Google will penalize you in search results and may even deindex your website, which means that your website won’t show up in Google search results at all. This could cost your brand a ton of traffic, and since we obviously don’t want that, we’re going to teach you how to link to your websites in a way that improves your search engine rankings.
Red Flags to Asset Linking
Now, there are many dangers to asset linking that you should be aware of. These have come about thanks to Google’s update called Panda 3.4. This update devalues your backlinks when you commit the following actions:
Linking to your site with the same exact anchor text from a number of sites
Having a high velocity of links going to your site (a large number of links in a small time period)
Trading links with websites that do not have a high Google PageRank
All of these actions used to be common practice in search engine optimization, but have more recently fallen out of favor because Google devalues these links and these backlinks do not help you rise in the search results.
What Google is looking for instead is a natural progression of growth on your website. This means that Google wants to see backlinks that come slowly over time from a variety of websites using a variety of anchor text. Remember, Google’s goal is to produce the best search results for people, using an algorithm. They rely on information that comes from people, like backlinks, to weigh heavily in their algorithm and count as a human vote for the credibility of a website.
How To Create a Natural Backlinking Strategy For Your Website
I can teach you how to get backlinks to your site in an authentic way that looks natural to Google’s search engine algorithm. We have a three-step process that will help you build these links without getting penalized in Google:
Step 1: Create a Map of All Your Online Assets
In part five of this series on reputation management, we talked about creating your company assets like “[company name] reviews.com” or “[company name] reviews” as a Facebook page or as a Twitter account. The first thing you want to do is group all these assets according to the keyword used to create them. For example, under your “reviews” grouping, you’ll put all of your assets that target the keyword “reviews.” You’ll also group all the assets that target the keyword “[company name] [your city] .com,” and so on. We recommend that you do these groupings in an Excel spreadsheet so you can stay organized.
Step 2: Choose a Focal Point in Each Grouping
The #1 rule in creating an asset linking strategy is that you can’t have all of your sites pointing to each other randomly. There must be one main site that all the other sites point to if you want to reap the rewards of this strategy. So in each grouping, you need to choose a focal site that you can link all the other sites to.
The focal point for your grouping is usually going to be your main “.com” asset. For example, your focal point for the “reviews” group would be “[company name] reviews.com.”
All of your focal point assets are going to eventually link to your main company website. This creates a three-tiered hierarchy of links, which we have found is the safest way to influence Google search results without looking unnatural and setting off a red flag.
Step 3: Link All Your Assets in Each Grouping to the Focal Point in Each Grouping
Lastly, you want to go to each site and put a link to the focal point of the grouping that site is in. This is going to increase your Google PageRank and will let Google know that this asset is the main one. Google will give this asset the most weight and the most credibility in its search engine results. Furthermore, all of these assets on sites like Quora, Facebook, and Twitter are going to lead to your focal points, which are each going to point back to your main company site.
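If a spreadsheet starts to feel unwieldy, the same three-tier plan can be expressed as a small data structure that prints a checklist of which asset should link where. This is purely an organizational sketch; the domains and profile URLs are invented examples:

```python
# Tier 1: the main company site every focal point eventually links to
main_site = "https://examplecompany.com"

# Tiers 2 and 3: each keyword grouping has one focal point and its supporting assets
groupings = {
    "reviews": {
        "focal_point": "https://examplecompanyreviews.com",
        "assets": [
            "https://www.facebook.com/examplecompanyreviews",
            "https://twitter.com/examplecoreviews",
            "https://examplecompanyreviews.blogspot.com",
        ],
    },
    "yourcity": {
        "focal_point": "https://examplecompanyyourcity.com",
        "assets": [
            "https://www.facebook.com/examplecompanyyourcity",
        ],
    },
}

# Print the link plan: assets -> focal point, focal point -> main site
for keyword, group in groupings.items():
    for asset in group["assets"]:
        print(f"[{keyword}] {asset} should link to {group['focal_point']}")
    print(f"[{keyword}] {group['focal_point']} should link to {main_site}")
```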
Pitfalls to Avoid When Creating an Asset Linking Strategy
Before I set you loose with this asset linking strategy, I want to caution you against some of the recent updates that have caused companies short or long-term positional drops without warning. The best way to link to your site strategically and naturally is to know the things that Google is looking for when trying to ferret out over-optimized sites.
1. Acquiring Excessive Links in a Short Amount of Time (Link Acquisition Velocity)
You want to keep track of your site’s link acquisition velocity because not only could you make a mistake and link too quickly, but your competitors could also attempt to use this tactic to get your website delisted from Google. You can check your site’s link acquisition velocity using one of two tools:
Ahrefs
Majestic
Also, don’t think that you can try to avoid this pitfall by gaining links from authoritative sites only. While you want to gain backlinks from sites with authority, if your website suddenly spikes with authoritative backlinks it may still send a red flag to Google. You can check this by using a link profile tool created by Tom Anthony.
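Both tools let you export your backlinks with a “first seen” date. Once you have that CSV, a quick way to eyeball your link acquisition velocity is to count new links per month and look for sudden spikes. A sketch that assumes a first_seen column in YYYY-MM-DD format (the exact column and file names will differ by tool):

```python
import csv
from collections import Counter

new_links_per_month = Counter()

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        month = row["first_seen"][:7]  # "2019-04-17" -> "2019-04"
        new_links_per_month[month] += 1

# A month with far more new links than its neighbors deserves a closer look
for month in sorted(new_links_per_month):
    print(f"{month}: {new_links_per_month[month]} new links")
```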
Finally, though it may be tempting to buy backlinks in order to increase your search engine ranking quickly, these rarely end up being worthwhile. First of all, Google does not allow paid links of any kind and if they find out you’ve purchased backlinks for your website, you will be penalized. Second, these links are rarely worth it because, while they might drive some traffic, it will likely be of low quality.
2. Excessive Site-wide Links From Websites that Link to You
Site-wide links are the ones found in a blogroll, sidebar, header, or footer. These types of links show up on every page of the linking website, which Google’s algorithm can read as a single site linking to you excessively. You can check who is linking to you with a site-wide link using Google Webmaster Tools: go to your site on the web and click on links to your site. You’ll be able to see the people who are linking to you most. If anyone is linking to you more than 20 times, they are probably linking to you with a site-wide link.
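The same kind of backlink export can reveal likely site-wide links without clicking through Webmaster Tools: group your backlinks by referring domain and flag any domain that links to you an unusually large number of times. A sketch that assumes a source_url column (again, the column name depends on the tool):

```python
import csv
from collections import Counter
from urllib.parse import urlparse

links_per_domain = Counter()

with open("backlinks_export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = urlparse(row["source_url"]).netloc
        links_per_domain[domain] += 1

# Domains with more than 20 links are probably linking site-wide
for domain, count in links_per_domain.most_common():
    if count > 20:
        print(f"Likely site-wide link: {domain} links to you {count} times")
```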
While Google’s Webmaster Tools is probably the quickest way to check on this, it doesn’t always report on the backlinks that Google actually sees. If you want to get more information, you can use a third-party service such as:
Ahrefs
Majestic SEO
Open Site Explorer (soon to be deprecated)
Keep in mind, however, that these sites are not trying to replicate Google’s behavior, and may also provide significantly skewed data. This can happen when Google removes a site from its index but the third-party tool still reports those links. Most of these third party tools do not account for deindexed websites and will simply provide you with all the information about all the websites that link to yours.
Another problem with site-wide links is that they look like link exchanges. A link exchange is when someone links to you in return for you linking to him or her. Google frowns upon this backlink method, though they don’t ban it completely.
Often, link exchanges get you into more trouble than they are worth because you are linking to a website that may not be high quality, which hurts you, and you are gaining a link from a website that might not be sending you high quality traffic. We recommend avoiding link exchanges with anyone who does not have a highly relevant site to yours.
3. Similar Anchor Text in Every Backlink
Humans who are not coordinating their efforts to link to a particular site would naturally use different anchor text in every backlink. Google gets very suspicious when it sees a number of sites that link to one website with the same anchor text. This means that in your linking hierarchy, you want to use a variety of anchor text phrases that are related to your keywords.
A great example of a company who had this problem is JCPenney. They tried to game the system by getting unrelated websites to link to them using a single keyword phrase as anchor text. The New York Times found out how JCPenney had gained such a high ranking in Google so quickly and outed them in the press. A Google employee named Matt Cutts ended up deranking them because they were in violation of Google’s webmaster guidelines.
You can check the anchor text of all your backlinks by exporting anchor text data from as many different sources as you can find. We recommend:
Ahrefs
Majestic SEO
Open Site Explorer
First, export the anchor text data you find from all of these sites and put it into one spreadsheet. Then, you want to start filtering the data to get one cohesive data set that makes sense for what you’re trying to understand. We recommend filtering out the following:
Duplicates – Since you are pulling from multiple tools, you’re likely to find duplicate data
Dead links – Google doesn’t use these because they are from sites that used to link to you but currently don’t
No follow links – Google doesn’t follow these and they are unlikely to cause over-optimization issues
Site-wide links – Google counts a link from a domain to a particular page on your website once, so you should too
Links from websites that have been de-indexed by Google – Since Google is not looking at these, there’s no point for you to look at them either
From there, you can use Excel to classify different anchor text variations and spot weaknesses in your backlinks. You’ll need to use your data analysis skills and may want to create a couple graphs to help you see patterns in the data. You also want to look for any keywords that have an excess of backlinks associated with them.
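If Excel slows down under tens of thousands of rows, the same merge-and-filter workflow can be scripted with pandas. The sketch below assumes you’ve normalized each export to the same column names (source_url, target_url, anchor_text and a boolean nofollow flag), which is an assumption you’d have to enforce yourself; it applies a few of the filters above and then counts anchor text variations:

```python
import pandas as pd

# Exports from Ahrefs, Majestic, etc., normalized to the same column names
exports = ["ahrefs_export.csv", "majestic_export.csv"]
links = pd.concat([pd.read_csv(path) for path in exports], ignore_index=True)

# Filter: drop duplicates pulled in by more than one tool
links = links.drop_duplicates(subset=["source_url", "target_url"])

# Filter: drop nofollow links, which rarely cause over-optimization issues
links = links[links["nofollow"] == False]

# Filter: count each linking domain once per anchor, to approximate removing site-wide links
links["domain"] = links["source_url"].str.extract(r"https?://([^/]+)", expand=False)
links = links.drop_duplicates(subset=["domain", "anchor_text"])

# Count anchor text variations and show the most repeated ones first
print(links["anchor_text"].value_counts().head(20))
```

Dead links and links from deindexed domains need extra data (crawl checks or index checks) before they can be filtered the same way.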
Advanced Techniques For Ranking Higher in Google Search Results
Now that we’ve gone over all the mistakes people make, I want to share with you a few advanced tips that will help you make your link building look natural.
1. Using Junk Anchors
In the real world, not everyone is going to link to your website using optimized anchor text. In fact, a number of people are going to link to your website using phrases like “Click Here!” or “Read Now.” This is the type of text that people use when they are creating calls-to-action within their text. You can actually rank faster in Google search results if you include these types of junk anchors in your optimization.
A bonus reason to use these junk anchors in your copy is that they help increase the click-through rate of the link. Since your goal is to drive as much traffic to your website as possible, this advanced tip can help you improve your traffic results.
2. Using Semantic Keyword Phrases
Semantics is the study of meaning and interpretation in words. You want to use semantic keywords when linking to your content. By using semantically equivalent words in your keyword phrases, you can better describe to Google what your website is about.
The classic example of semantic keywords is “lemon.” Does it refer to a color, a fruit, a scent? In this case, it refers to a defective car. So instead of using “lemon” as the anchor text, you would want to use longer phrases like, “how to spot a lemon on the used car lot.” Then, you’d want to come up with semantically equivalent phrases, like “how to spot a defective car on the used car lot.”
If you’re interested in digging deeper into this topic, Brian Clark of Copyblogger has provided a simple guide to semantic keyword research on his website.
3. Using Brand Mentions as Anchor Text
Typically, when humans link to each other they use names and brands within the anchor text. For example, if John Carpenter wanted to link to something I’ve written on greengabbard's blog, he would probably link using anchor text that reads something like, “I came across this awesome article on search engine optimization by Micah Gabbard of greenGabbard Media.” Because people like to give credit where credit is due, Google often rewards brand mentions within the anchor text. Link strategies that do not include brand mentions signal to Google that someone is trying to game their system.
4. Gaining Social Links on Social Media Websites
Google’s #1 goal is to create search engine results that are relevant to as many people as possible, so it is always looking for ways to incorporate social proof into its algorithm. That’s why gaining attention from humans is so valuable to search engine optimization. You want to get as many shares for your content as you can across social media outlets like Facebook, Twitter, Delicious, StumbleUpon, and many other social sharing sites.
You can increase social sharing by putting links to share prominently at the bottom of all of your content and by asking your readers to share the content within your blog posts. You can also create great content that social influencers find valuable and want to promote to their audiences. We’ll go into much more detail about how to create excellent content in a later lesson in this reputation management guide.
Creating a Long-Lasting Linking Strategy
While this asset linking strategy may take a little longer to generate results in Google search, I guarantee that it’s the best way to teach Google about your website. We’ve seen companies go overboard when creating online assets because they immediately start linking each asset to all their other assets. This helps get as many links as possible, but it also creates a link exchange, which looks unnatural and sends a red flag to Google. We have seen this backfire many times and have watched companies lose all the rankings because Google has delisted them from search engine results.
It’s worthwhile to take your time and create a hierarchy of links with your linking strategy. This will position you for long-term success.
Thanks for reading this part of the guide
If you have any questions about reputation management, shoot me an email at [email protected]
https://greengabbard.com/reputation-management/ultimate-reputation-management-guide-pt-6/
0 notes
amrutservices · 7 years ago
Text
Why High Quality Content Matters More than Keywords for SEO
Attention content creators: Google reads everything you write! Well, not “reads” in the literal sense, but its algorithms are now sophisticated enough to pick up on unnatural language and poor formatting—both of which send strong negative signals that hurt your ability to rank.
In fact, Google’s approach to ranking has gotten so sophisticated that they’ve learned that content quality matters more to search users than the presence of any particular keyword phrase. As a result, you may find a No. 1 search result that doesn’t contain an exact match keyword anywhere in the body.
We’re serious! In an exhaustive study of 600,000 keyword phrases, 18 percent of the domains that ranked position 20 or higher didn’t have the keyword in the text at all. Instead, these sites had a few things in common: website visits, user behavior signals and the number of links to the content all influenced Google to rank them near the top. All of these signals tell Google one thing: people seem to like this content.
In addition to these behavior-based markers of content quality, Google and other search engines actively sift through content to see signals of quality within the text itself.
After all, Google’s main objective isn’t getting your website traffic; it’s giving people good search results.
Thankfully, the company’s own guidelines are fairly specific and helpful. We’ll point you towards the exact markers of “high quality” Google is looking for.
What Are the Red Flags for Poor Content Quality?
Google’s guidelines for content quality are pretty thorough. This is likely because it’s hard to put into words exactly what makes something “good” or “high quality.” It takes a lot of nuance!
On the other hand, you can fairly quickly point out factors that immediately signal poor quality.
It’s like baking cake. There are a million different types of cakes out there and as many ways to prepare them. Flour, sugar, eggs and milk may be your raw ingredients, but you can make thousands of different types of delicious cakes. Also, “the right cake to bake” differs according to the context and circumstances. You can have a moist cake that’s yummy, or you could have a more solid cake that still does the trick.
But you can’t put sand in your cake. That’s a no-no. And it’s an automatic recipe for an inedible cake.
Similarly, Google highlights some markers of poor quality that instantly flag a page as having content not worth ranking:
Spamming keywords, especially if they’re irrelevant
Creating content that’s mostly copies of existing content
Typos, bad spelling, grammar errors
Sentences or paragraphs that never seem to end
Content that has little to no formatting, leaving just a dense chunk of text
Going crazy with links that aren’t relevant to the content at hand
Dropping lists of keywords somewhere in your page, especially if you’re hiding them with text color choices
Content that is excessively thin, especially for pages like blogs that promise substance
There are also a number of ways to get instantly deindexed by Google that go beyond content quality. Since that’s something you likely want to avoid, they’re well worth reviewing!
Google’s SEO Guide Considers Content Quality, Navigation Ease More Important Than Keyword Use
If you go and take a look at Google’s SEO starter guide, you’ll find that suggestions for how to use keywords properly don’t come up until around halfway through. Before that point, they take a moment to repeat four times that you shouldn’t overuse keywords or stuff them into your technical SEO elements.
Once they do mention keywords, they simply advise that you tailor your keyword strategy to your audience. For instance, people who watch soccer regularly might expect “FIFA” or “football” to be in the content they read, while casual users may expect more generic terms like “soccer playoffs.”
Immediately after that, they go back into quality. “Avoid writing sloppy text with many spelling and grammatical mistakes,” they suggest, as well as “awkward or poorly written content.”
To truly hammer the point home, Google spends far more time writing about ease of navigation and quality of life improvements for website visitors. Based on how the information is organized, Google cares more about your site map than your keyword usage when deciding rank.
“The navigation of a website is important in helping visitors quickly find the content they want,” explains the search giant. “It can also help search engines understand what content the webmaster thinks is important.”
All of this information can be summed up thusly: search engines aren’t dumb. They know the things that make life easier for their users and content better to read in general. They pay far more attention to these elements than how you use keywords.
In fact, with voice search on the rise, search engines have had to get smarter than ever about interpreting keyword intent and finding semantically related terms. That way, someone searching for “best places to eat near me” can pull up a list of “top-rated restaurants” without having to first sift through unhelpful results that contain exact keyword matches.
5 Tips for Writing Higher Quality Content
So now you’ve heard what definitely not to do when creating content, with only a hint of what so-called “high quality content” looks like.
To steer you in the right direction, here are a few general tips that can boost the quality of all content.
“Make pages primarily for users, not for search engines.”
This rule comes directly from Google’s Webmaster Guidelines. It’s actually the very first thing they say under “Basic Principles.”
The search giant even suggests you ask yourself “Does this help my users? Would I do this if search engines didn’t exist?” when making a decision on how your website operates. Those questions definitely apply when writing new content.
So foremost, determine an audience need based on a keyword search, and write to answer that need. The better able you are to satisfy someone’s search intent, the better behavior signals your site receives, and the more likely you are to rank.
If you’re at a loss for how to connect a keyword to user needs, do a little research. Plug in the keyword yourself, and try to find questions related to it.
Or, if the keyword is directly related to an “I want to purchase something or research a purchase” intent, take notes on the content that ranks highest. Chances are good that the page offers excellent examples of site organization, layout clarity and overall usability in addition to some solid text content.
Edit Your Writing, and Push Yourself to Improve 
Like good cake, good writing is definitely in the eye of the beholder. But at the same time, you wouldn’t bank on your cake getting top votes if all you did was use a box mix.
In other words, if you want to write better, you’re going to have to learn from others. We suggest reading publisher sites related to your industry that get high traffic, and cover topics similar to what you want on your blog.
Some general guidelines for improving your writing include:
Use fewer “being” and “linking” verbs in favor of strong action verbs. If you find yourself writing words like “is, was, are and be,” go back and see if you can identify the true subject of the sentence and what it’s doing.
Structure your writing like you would an outline. Tell people what they’re going to learn from your post as soon as possible, and then delve into each smaller point one at a time until you’re finished.
Write casually but not unprofessionally. Aim for a “friendly, conversational tone with a clear purpose—somewhere between the voice you use when talking to your buds and that you’d use if you were a robot,” suggests Search Engine Land’s paraphrasing of Google’s own Developer Documentation Style Guide.
Edit your writing! Far too many people don’t go back and reread. Watch out for sentence and paragraph transitions that might make it hard for readers to follow your logic. Ask people for their opinion on how readable everything is. If they have a complaint, see if you can break the excerpt down into its simplest parts and reconstruct it.
Read, Read, Read and Read Some More
Reading teaches you how words and sentences form ideas. We take a lot of this stuff for granted, but it’s quite complex. Fortunately, others have mastered it and can teach you techniques to add to your repertoire.
Pay Attention to Your Audience’s Behavior Signals
What content pages get the most views? Which ones get the best responses or the most engagement in comments or on social media? Where do people tend to spend the most time?
Look to your own Google Analytics data, and try to identify patterns. People tell you what they like without ever having to say a word.
If You’re Struggling to Write Good Content, Go Back to the Basics
You may feel hesitant about writing on simple topics, such as “The Beginner’s Guide to SEO” or something like “Why People Buy Things,” but these are actually great topics. Yes, they’ve been done to death, but they help people learn.
Also, you might put things in a certain way that makes an extremely deep or complex subject click for your audience.
Above all else, articles like these teach you the fundamentals of writing for your audience. You learn how to break big concepts down to their bare components and communicate complex ideas with clarity.
Next to reading, writing down the basics is the best way to teach yourself how to craft better content.
Stop Obsessing Over Keywords and Start Writing Better
The writing’s on the wall: Google and online audiences are sick of bad content, keyword stuffing and deceptive practices that are aimed at helping websites rank but make readers miserable.
Put content quality factors like readability, grammar and topic organization as a higher priority than keyword use. People will know what you’re talking about, even if you don’t use an exact keyword match—and now search engines will too.
from Amrut Services https://amrutservices.com/why-high-quality-content-matters-more-than-keywords-for-seo/
0 notes
coronationimphx · 7 years ago
Link
When search engines don’t index your site, all your SEO efforts are in vain. That’s search engine optimization in a nutshell. The good news is that indexation problems usually trace back to seven common causes, each of which can be identified and fixed.
1) Get your site up to speed
A slow, overloaded site is the number one reason SEO fails. The telltale signs are 500 server errors showing up in Google Search Console and Bing Webmaster Tools. When you see these errors, you know what’s happening: the crawlers can’t get through.
When search engines themselves have difficulty accessing your web pages, none of the SEO effort you’ve put in can pay off. The bots are quite resilient and will keep trying, but only to a certain extent. Once they’ve made enough attempts without success, they’ll have no choice but to give your website a failing grade.
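If you want a quick spot check outside of those webmaster tools, one option (assuming you have the curl command-line tool available, and swapping a real URL in for the placeholder) is to request a page and read the status line of the response headers:

curl -I https://www.example.com/some-page/

A response beginning with HTTP/1.1 200 OK means the server answered normally; anything in the 500 range means the server failed before a bot could even read the page.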
2) Avoid the copycat syndrome
Copying content is the very antithesis of SEO. Like a school teacher comparing suspiciously similar essays, search bots compare two or more identical pages and may discard the indexation effort altogether. As a result, all identical pages get a failing grade. School justice rules even in the online world.
The same rule applies to very similar pages. Let’s say it’s a 75% match: that’s enough to sound the alarm bells, and again, all parties get the boot. Clearly, you’re not achieving any SEO success here.
On the other hand, your SEO is getting somewhere when the web crawlers start to see real signs of uniqueness on your part, like genuinely unique content, headings and titles. This kind of SEO glory often eludes e-commerce sites because of their identical product descriptions.
3) When specific changes are bad for SEO
Be very careful about updating your site design and architecture. Web crawlers act as sentinels over the beginning, the end and everything in between on your pages. Once previously established correlations or logical relationships disappear, the bots start to wonder what’s happening. In return for this perceived hocus pocus, they don’t just spit out warnings; they are quick to issue a demotion, just like your boss at work. So in spite of all the bleeding-edge SEO on your part, your website still goes nowhere.
Let’s say, for instance, that your web designer has decided to move the text within an image grid just because it happens to look more appealing that way. Such an innocent move is enough to impact your site’s SEO in a negative manner. The bots are looking for the meat of the message, and all of a sudden it’s not there. What do you think they will do?
4) How new URLs can affect SEO negatively
Of all site types, e-commerce platforms love to update their URLs the most. This is often done to economize on space and text or in response to changing product data. However, if you look at this from the SEO perspective, these changes present the bots with a multiple choice question. A) Do they index the old site? B) No, the new site; C) Index both A and B; and, D) Index none of the above.
So the question for you is: are you willing to take a risk that can affect your site’s SEO one way or the other? Or would you rather not take chances at all? The smart SEO expert will know what to do in this case. Go figure.
5) How deletion or redirection presents virtual spiders with an SEO conundrum
When your webmaster is in the habit of deleting pages or setting up redirects, this can also have a negative impact on SEO. Just as in the previous example, the site designer is making the bots go through a multiple choice question, and more often than not this will lead to an SEO decrease rather than an increase. Hence, the situation is worse than the previous example.
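If a page really does have to go, you can at least answer the multiple choice question for the bots with a permanent redirect from the old URL to its replacement. Here is a minimal sketch for an Apache .htaccess file; the paths are placeholders, and most web servers and ecommerce platforms offer an equivalent setting:

Redirect 301 /old-product-page/ /new-product-page/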
6) The science of robotics
Only robots fully understand one another. Accordingly, the bots and the robot commands that exist have a kind of affinity that is hard to break. Take for example robots.txt, which the vast majority of search bots will follow to a T. The same can be said for the noindex meta directive, which works from the other direction: robots.txt controls what gets crawled, while noindex controls what stays in the index. Either way, robotic commands speak the natural language of the bots, so you must use them for SEO glory no matter how archaic they may seem. Just like meta tags, they are the oldies but goodies of SEO.
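As a concrete illustration (the directory name is a placeholder), the two commands look like this. The robots.txt rule keeps compliant crawlers out of a whole section of the site, while the meta tag sits in an individual page’s HTML head and asks engines to leave that one page out of their index:

User-agent: *
Disallow: /checkout/

<meta name="robots" content="noindex">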
7) Google Search Console versus Bing Webmaster Tools
No matter whether you are a big Google fan or a Bing fan, at the end of the day you must use both to take SEO to new heights. There are no ifs or buts, and no choosing one over the other; both are too powerful to settle with a coin toss. Ultimately, they both do a great job of balancing the indexing and deindexing of specific parts of your website. So for the best SEO results, use one and then the other. Only then can you say that you’ve done your job.
Figuring out your website’s indexation issues can be extremely hard and frustrating. You may even have on-site quality issues that Google is penalizing you for. If you need help with this, our SEO experts in Las Vegas at http://phoenixseoconsultants.com/seo/las-vegas-nv can diagnose and fix your issues.
The post SEO: Seven Answers Why Your Website’s Indexation Has Reduced appeared first on The Phoenix SEO Consultants.
http://ifttt.com/images/no_image_card.png via http://phoenixseoconsultants.com
0 notes
thexploreking · 7 years ago
Text
Ultimate On-Site SEO Guide for Your Website & Online Store
As bloggers and store owners, we want to make our websites appeal to both search engines and users. A search engine ‘looks’ at a website quite differently to us humans. Your website design, fancy graphics and well-written blog posts count for nothing if you have missed out on the critical on-site SEO covered in this post. SEO is not dead. Modern SEO is not about fooling the search engines – it is about working with the search engines to present your products and content in the best possible way. In this post, SEO expert Itamar Gero shows us exactly how…
The Ultimate On-Site SEO Guide for store owners and bloggers
Ecommerce is huge business. By one industry estimate, ecommerce sales in the US alone hit $394.9 billion in 2016, and the rising trend in ecommerce sales and its influence on offline retail is expected to continue well into the 2020s worldwide.
SEO Guide To Your Online Store
SEO Guide: This SEO guide and tutorial is one of the most in-depth we have ever produced. It is easy to be overwhelmed and to feel muddled with SEO, but in reality it is much more straightforward than it may initially appear. The rewards for getting Search Engine Optimization right are HUGE! Two SEO resources you will need:
=> Google Search Console
=> Screaming Frog SEO Spider
What is Ecommerce SEO?
Simply put, ecommerce SEO is the process of optimizing an online store for greater visibility on search engines. It has four main facets:
Keyword research
On-site SEO
Link building
Usage signal optimization
In this post, I’ll tackle the most foundational and arguably the most crucial among the four areas: on-site SEO. In our experience working with thousands of agencies, we can attribute the greatest impact on overall organic traffic growth to optimizations made within the ecommerce sites we handle. While link building and other off-page SEO activities are important, on-site SEO sets the tone for success each and every time.
The Resurgence of On-Site SEO
On-site SEO is the collective term used to describe all SEO activities done within the website. It can be divided into two segments: technical SEO and on-page SEO. Technical SEO mostly deals with making sure that the site stays up, running and available for search bot crawls. On-page SEO, on the other hand, is geared more towards helping search engines understand the contextual relevance of your site to the keywords you’re targeting.
I’m focusing on on-site SEO today because of the undeniable resurgence it’s been having during the past three years or so. You see, the SEO community went through a phase in the late 2000s to around 2011 when everyone was obsessed with the acquisition of inbound links. Back then, links were far and away the most powerful determinant of Google rankings. Ecommerce sites that were locked in brutal ranking battles were at the core of this movement, and competition eventually revolved around the matter of who was able to get the most links – ethically or otherwise.
Eventually, Google introduced the Panda and Penguin updates, which punished a lot of sites that proliferated link and content spam. Quality links that boosted rankings became harder to come by, making on-site signals more influential in the rankings. The SEO community soon started seeing successful ecommerce SEO campaigns that focused more on technical and on-page optimization than heavy link acquisition. This post will show you what you can do on your site to take advantage of the on-site SEO renaissance:
Technical SEO
As mentioned earlier, technical SEO is mostly about making sure your site has good uptime, loads fast, offers secure browsing to users and facilitates good bot and user navigation through its pages. Here’s a list of things you need to monitor constantly to ensure a high level of technical health:
The Robots.txt File
The robots.txt file is a very small document that search bots access when they visit your site. This document tells them which pages can be accessed, which bots are welcome to do so and which pages are off-limits. When robots.txt disallows access to a certain page or directory path within a website, bots from responsible sites will adhere to the instruction and not visit that page at all. That means disallowed pages will not be listed in search results. Whatever link equity flows to them is nullified, and these pages will not be able to pass any link equity on either.
When checking your robots.txt file, make sure that all pages meant for public viewing don’t fall under any disallow parameters. Similarly, you’ll want to make sure that pages which don’t serve the intent of your target audience are barred from indexing. Having more pages indexed by search engines may sound like a good thing, but it really isn’t.
Google and other web portals constantly try to improve the quality of listings displayed in their SERPs (search engine results pages). Therefore, they expect webmasters to be judicious in the pages they submit for indexing, and they reward those who comply. In general, search engine queries fall under one of three classifications:
Navigational
Transactional
Informational
If your pages don’t satisfy the intents behind any of these, consider using robots.txt to prevent bot access to them. Doing so will make better use of your crawl budget and make your site’s internal link equity flow to more important pages. In an ecommerce site, the types of URLs that you usually want to bar access to are:
Checkout pages – These are the series of pages that shoppers use to choose and confirm their purchases. These pages are unique to their sessions and therefore of no interest to anyone else on the Internet.
Dynamic pages – These pages are created through unique user requests such as internal searches and page filtering combinations. Like checkout pages, these pages are generated for one specific user who made the request. Therefore, they’re of no interest to most people on the Web, making the impetus for search engines to index them very weak. Further, these pages eventually expire and send out 404 Not Found responses when re-crawled by search engines. That can be taken as a signal of poor site health that can negatively impact an online store’s search visibility. Dynamic pages can easily be identified by the presence of the characters “?” and “=” in their URLs. You can prevent them from being crawled by adding a line to the robots.txt file that says something like this:
Disallow: /*?
Staging pages – These are pages that are currently under development and unfit for public viewing. Make sure to set up a path in your site’s directory specifically for staging webpages and make sure the robots.txt file is blocking that directory.
Backend pages – These pages are for site administrators only. Naturally, you’ll want the public not to visit the pages – much less find them in search results. Everything from your admin login page down to the internal site control pages must be placed under a robots.txt restriction to prevent unauthorized entry.
Note that the robots.txt file isn’t the only way to restrict the indexing of pages. The noindex meta directive tag, among others, can also be used for this purpose. Depending on the nature of the deindexing situation, one may be more appropriate than the other.
The XML Sitemap
The XML sitemap is another document that search bots read to get useful information. This file lists all the pages that you want Google and other spiders to crawl. A good sitemap contains information that gives bots an idea of your information architecture, the frequency at which each page is modified and the whereabouts of assets, such as images, within your domain’s file paths. While XML sitemaps are not a necessity for every website, they’re very important to online stores due to the number of pages that a typical ecommerce site has. With a sitemap in place and submitted to tools like Google Search Console, search engines tend to find and index pages that are deep within your site’s hierarchy of URLs.
Your web developer should be able to set up an XML sitemap for your ecommerce site. More often than not, ecommerce sites already have one by the time their development cycles are finished. You can check this by visiting your sitemap’s URL – typically yourstore.com/sitemap.xml – in a browser. If you see something like this, you already have your XML sitemap up and running:
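For illustration only (the store URL and date below are placeholders), a bare-bones sitemap file is simply a list of entries like this:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.yourstore.com/category/denim-jeans/</loc>
    <lastmod>2018-05-01</lastmod>
  </url>
</urlset>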
Having an XML sitemap isn’t a guarantee that all the URLs listed on it will be considered for indexing. Submitting the sitemap to Google Search Console ensures that the search giant’s bot finds and reads the sitemap. To do this, simply log in to your Search Console account and find the property you want to manage. Go to Crawl>Sitemaps and click the “Add/Test New Sitemap” button on the upper right. Just enter the URL slug of your XML sitemap and click “Submit.” You should be able to see data on your sitemap in 2-4 days.
The sitemap report tells you how many pages are submitted (listed) in the sitemap and how many Google has indexed. In a lot of cases, ecommerce sites will not get every page they submit in the sitemap indexed by Google. A page may not be indexed due to one of several reasons, including:
The URL is Dead – If a page has been deliberately deleted or is afflicted with technical problems, it will likely yield 4xx or 5xx errors. If the URL is listed in the sitemap, Google will not index a page from a URL that is not working properly. Similarly, if a once-live page that’s listed in the sitemap goes down for long periods of time, it may be taken off the Google index.
The URL is Redirected – When a URL is redirected and is yielding either a 301 or a 302 response code, there’s no sense in having it in the sitemap. The redirect’s target page should instead be listed if it’s not there already. If a redirecting URL is listed in a sitemap, there’s a good chance Google will simply ignore it and report it as not indexed.
The URL is being Blocked – As discussed under the robots.txt section, not all pages in an ecommerce site need to be indexed. If a webpage is being blocked by robots.txt or the noindex meta tag, there’s no sense listing it on the XML sitemap. Search Console will count it as not being indexed precisely because you asked it not to be. Checkout pages, blog tag pages and other product pages with duplicate content are examples of pages that need not be listed in the XML sitemap.
The URL has a Canonical Link – The rel=canonical HTML tag is often used in online stores to tell search engines which page they want indexed out of several very similar pages. This often happens when a product has multiple SKUs with very small distinguishing attributes. Instead of having Google choose which one to display on the SERPs, webmasters gained the ability to tell search engines which page is the “real” one that they want featured. If your ecommerce site has product pages that have the rel=canonical element, there’s no need to list them on your sitemap. Google will likely ignore them anyway and honor the one they’re pointing to.
The Page has Thin Content – Google defines thin content as pages with little to no added value. Examples include pages with little to no textual content or pages that do have text but are duplicating other pages from within the site or from elsewhere on the web. When Google deems a page as being thin, it either disfavors it in the search results or ignores it outright. If you have product pages that carry boilerplate content lifted from manufacturer sites or other pages from your site, it’s usually smart to block indexing on them until you have the time and manpower to write richer and more unique descriptions. It also follows that you should avoid listing these pages on your XML sitemap, precisely because they’re less likely to be indexed.
There is a Page-Level Penalty – In rare instances, search engines might take manual or algorithmic actions against sites that violate their quality guidelines. If a page is spammy, or has been hacked and infused with malware, it may be taken off the index. Naturally, you’ll want pages like these off your sitemap.
The URL is Redundant – Duplicate URLs in the XML sitemap, as you may expect, will not be listed twice. The second one will likely be ignored and left off the index. You can solve this issue by opening your sitemap in your browser and saving it as an XML document that you can open in Excel. From there, go to the Data tab.
Highlight the column where the URLs are in your sitemap and click on Remove Duplicates.
Restricted Pages – Pages that are password-protected or that only grant access to specific IPs will not be crawled by search engines and therefore not indexed.
The fewer inappropriate pages you list in your sitemap, the better your submission-to-indexing ratio will be. This helps search engines understand which pages within your domain hold the highest degrees of importance, allowing them to perform better for the keywords they represent.
Crawl Errors
In online stores, products are routinely added and deleted depending on a lot of factors. When indexed product or category pages are deleted, it doesn’t necessarily mean that search engines automatically forget about them. Bots will continue to attempt crawls of these URLs for a few months until they’re fixed or taken off the index due to chronic unavailability.
In technical terms, crawl errors are pages that bots can’t successfully access because they return HTTP error codes. Among these codes, 404 is the most common, but others in the 4xx range apply. While search engines recognize that crawl errors are a normal occurrence in any website, having too many of them can stunt search visibility. Crawl errors tend to look like loose ends on a site, which can disrupt the proper flow of internal link equity between pages. Crawl errors are usually caused by the following occurrences:
Deliberately deleted pages
Accidentally deleted pages
Expired dynamic pages
Server issues
To see how many crawl errors you have in your ecommerce site and which URLs are affected, access Google Search Console and go to the property concerned. In the left sidebar menu, go to Crawl>Crawl Errors.
Depending on the types of pages you find in your crawl error report, there are several distinct ways to tackle them, including:
Fix Accidentally Deleted Pages – If the URL belongs to a page that was deleted unintentionally, simply re-publishing the page under the same web address will fix the issue.
Block Dynamic Page Indexing – As recommended earlier, dynamic pages that expire and become crawl errors can be prevented by blocking bot access using robots.txt. If no dynamic page is indexed in the first place, no crawl error will be detected by search engines.
301 Redirect the Old Page to the New Page – If a page was deleted deliberately and a new one was published to replace it, use a 301 redirect to lead search bots and human users to the page that took its place. This not only prevents the occurrence of a crawl error, it also passes along any link equity that the deleted page once held. However, don’t assume that the fix for every crawl error is a 301 redirect. Having too many redirects can affect site speed negatively.
Address Server Issues – If server issues are the root cause of downtime, working with your web developer and your hosting service provider is your best recourse.
Ignore Them – When pages are deleted deliberately but they’re of no great importance and no replacement page is planned, you can simply allow search engines to flush them out of the index in a few months.
Having as few crawl errors as possible is a hallmark of responsible online store administration. A monthly check on your crawl error report should allow you to stay on top of things.
SEO Guide To Fix Broken Links
Broken links prevent the movement of search spiders from page to page. They’re also bad for user experience because they lead visitors to dead ends on a site. Due to the volume of pages and the complex information architectures of most ecommerce sites, it’s common for broken links to occur here and there. Broken links are usually caused either by errors in the target URLs of links or by linking to pages that are returning 404 Not Found server response codes. A crawler such as Screaming Frog SEO Spider will find them for you. For smaller online stores, the free version should suffice. For sites with thousands of pages, you’ll need the paid version for a thorough scan.
To check for broken links using Screaming Frog, simply set the app to function on the default Spider mode. Enter the URL of your home page and click the Start button.
Wait for the crawl to finish. Depending on the number of pages and your connection speed, the crawl could take several minutes to an hour.
When the crawl finishes, go to Bulk Export and click on “All Anchor Text.” You will then have to save the data as a CSV file and open it in Excel.
Go to Column F (Status Code) and sort the values from largest to smallest. You should be able to find the broken links on top. In the example crawl, the site only had four broken links. Column B (Source) refers to the page where the link can be found and edited. Column C (Destination) refers to the URL where the link is pointing to. Column E (Anchor) pertains to the text the link on the source page is attached to.
You can fix broken links through one of the following methods:
Fix the Destination URL – If the destination URL was misspelled, correct the typo and set the link live.
Remove the Link – If there was no clerical error in the destination URL but the page it used to link to no longer exists, simply remove the hyperlink.
Replace the Link – If the link points to a page that has been deleted but there’s a replacement page or another page that can sub for it, replace the destination URL in your CMS.
Fixing broken links improves the circulation of link equity and gives search engines a better impression of your site’s technical health.
Duplicate Content and SEO for Your Online Store
As mentioned earlier, online stores have a higher tendency to suffer from content duplication issues due to the number of products they carry and similarities in the names of their SKUs. When Google detects enough similarities between pages, it makes decisions on which pages to show in its search results. Unfortunately, Google’s choice of pages usually isn’t consistent with yours.
To find pages in your ecommerce site that are possibly affected by duplication problems, go to Google Search Console and click on Search Appearance in the left sidebar, then open the HTML Improvements report.
In that report, blue linked text indicates that your site has problems with a specific type of HTML issue: in this case, duplicate title tags. Clicking on it allows you to see the pages concerned. You can then export the report to a CSV file and open it in Excel. Investigate the report and the URLs involved, and analyze why title tags and meta descriptions might be duplicating. In online stores, this could be due to:
Lazy Title Tag Writing – In some poorly optimized sites, web developers might leave all the pages with the same title tags. Usually, it’s the site’s brand name. This can be addressed by editing the title tags and appropriately naming each page according to the essence of its content.
Very Similar Products – Some online stores have products that have very similar properties. For example, an ecommerce store that sells garments can sell a shirt that comes in 10 different colors. If each color is treated as a unique SKU and comes with its own page, the title tags and meta descriptions can be very similar. As mentioned in a previous section, using the rel=canonical HTML element can point bots to the version of these pages that you want indexed. It will also help search engines understand that the duplication in your site is by design.
Accidental Duplication – In some cases, ecommerce CMS platforms could be misconfigured and run amok with page duplications. If this happens, a web developer’s help in addressing the root cause is necessary. You will likely need to delete the duplicate pages and 301 redirect them to the original.
Bonus: the pages identified as having short or long title tags and meta descriptions can be dealt with simply by editing these fields and making sure length parameters are followed. More on that in the Title Tags and Meta Descriptions section of this guide.
Site Speed
Over the years, site speed has become one of the most important ranking factors in Google. For your online store to reach its full ranking potential, it has to load quickly on both mobile and desktop devices. In competitive keyword battles between rival online stores, the site that has the edge in site speed usually outperforms its slower competitors. To test how well your site performs in the speed department, go to Google PageSpeed Insights, paste in the URL of the page you want to test and hit Enter.
Google will rate the page on a scale of 1-100. 85 and above are the preferred scores. In this example, the site tested was way below the ideal speed on desktop and mobile. Fortunately, Google provides technical advice on how to address the load time issues. Page compression, better caching, minifying CSS files and other techniques can greatly improve performance. Your web developer and designer will be able to help you with these improvements.
Secure URLs
A couple of years ago, Google announced that they’re making secure URLs a ranking factor. This compelled a lot of ecommerce sites to adopt the protocol in pursuit of organic traffic gains. While we observed the SEO benefits to be marginal, the true winners are end users who enjoy more secure shopping experiences where data security is harder to compromise.
You can verify whether your online store uses secure URLs by checking for the padlock icon in the address bar of your web browser. If it’s not there, you might want to consider having it implemented by your dev. To date, most ecommerce sites only have secure URLs on their checkout pages. However, more and more online store owners are making the switch to the SSL protocol.
Implementing secure URLs in an ecommerce site can be a daunting task. Search engines see HTTP and HTTPS counterpart pages as two different web addresses. Therefore, you’d have to re-create every page in your site in HTTPS and 301 redirect all the old HTTP equivalents to make things work. Needless to say, this is a major decision where SEO is just one of several factors to consider.
On-Page SEO
On-page SEO refers to the process of optimizing individual pages for greater search visibility. This mainly involves increasing content quality and making sure keywords are present in elements of each page where they count. On-page SEO can be a huge task for big ecommerce sites, but it has to be done at some point. Here are the elements of each page that you should be looking to tweak:
Aside from the robots.txt file, there’s another way to restrict indexing on your pages: what are known as meta directive tags. Simply put, these are HTML instructions within the <head> part of each page’s source code which tell bots what they can and can’t do with a page. The most common ones are:
Noindex – This tag tells search engines not to index a page. Similar to the effects of a robots.txt disallow, a page will not be listed in the SERPs if it has this tag. The difference, however, is the fact that a responsible bot will not crawl a page restricted by robots.txt, while a page with noindex will still be crawled – it just won’t be indexed. In that regard, search spiders can still pass through a noindexed page’s links and link equity can still flow through those links. The noindex tag is best used for excluding blog tag pages, unoptimized category pages and checkout pages.
Noarchive – This meta tag allows bots to index your page but not to keep a cached version of it in their storage.
Noodp – This tag tells bots not to list a site in the Open Directory Project (DMOZ).
Nofollow – This tag tells search engines that the page may be indexed but the links in it should not be followed.
As discussed earlier, it’s best to be judicious when deciding which pages you should allow Google to index. Here are a few quick tips on how to handle indexing for common page types:
Home Page – Allow indexing.
Product Category Pages – Allow indexing. As much as possible, add breadth to these pages by enhancing the page’s copy. More on that under the “Unique Category and Product Copy” section.
Product Pages – Allow indexing only if your pages have unique copy written for them. If you picked up the product descriptions from a catalog or a manufacturer’s site, disallow indexing with the “noindex,follow” meta tag.
Blog Article Pages – If your online store has a blog, by all means allow search engines to index your articles.
Blog Category Pages – Allow indexing only if you’ve added unique content. Otherwise, use the “noindex,follow” tag.
Blog Tag Pages – Use the “noindex,follow” tag.
Blog Author Archives – Use the “noindex,follow” tag.
Blog Date Archives – Use the “noindex,follow” tag.
Popular ecommerce platforms such as Shopify, Magento and WordPress all have built-in functionalities or plugins that allow web admins to easily manage meta directives on a page.
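To make the syntax concrete, here is roughly what these directives look like inside a page’s <head>. The product URL in the canonical example is a placeholder, and the rel=canonical element mentioned earlier is included for comparison:

<meta name="robots" content="noindex,follow">
<!-- crawl the links on this page, but keep the page itself out of the index -->

<meta name="robots" content="noarchive">
<!-- index the page, but do not store a cached copy -->

<link rel="canonical" href="https://www.yourstore.com/shirts/blue-oxford-shirt/">
<!-- treat that URL as the "real" version of this page -->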
If you're wondering why a particular page isn't being indexed, or why a supposedly restricted page is appearing in the SERPs, you can manually check its source code. Alternatively, you can audit the meta directives across your entire site by running a Screaming Frog crawl.
Tumblr media
After running a crawl, simply check the Meta Robots column. You should be able to see which meta directive tags each page in your site has. As with any Screaming Frog report, you can export this to an Excel sheet for easier data management.

Title Tags

The title tag remains the most important on-page ranking factor for most search engines. This is the text that headlines a search result, and its main function is to tell human searchers and bots what a page is about in 60 characters or less.
Tumblr media
Due to the brevity and importance that title tags innately have, it's crucial for any ecommerce SEO campaign to get these right. Well-written title tags should have the following qualities to maximize a page's ability to rank:

60 characters or less, including spaces
Gives the gist of what the page is about
Mentions the page's primary keyword early on
Caters to buying intent by mentioning purchase action words
Optionally mentions the site's brand
Uses separators such as "-" and "|" to demarcate the title tag's core and the brand name

Here's an example of a good title tag:

Buy Denim Jeans Online | Men'sWear.com

We see that the purchase word "Buy" is present but the product type "Denim Jeans" is still mentioned early on. The word "Online" indicates that the purchase can be made over the Internet, rather than the page merely telling the searcher where he or she can buy in person. The brand of the online store is also mentioned, yet the whole text block stays within the recommended 60-character length.

Meta Descriptions

Meta descriptions are the text blocks that you can see under the title tags in search results. Unlike title tags, these aren't direct ranking factors. As a matter of fact, you can get away with not writing them, and Google will simply pick up text from the page's content that it deems most relevant to a query.
Tumblr media
That doesn't mean you shouldn't care about meta descriptions, though. When written in a well-phrased and compelling manner, they can determine whether a user clicks on your listing or goes to a competitor's page. The meta description may not be a ranking factor, but a page's click-through rate (CTR) certainly is. CTR is an important engagement signal that Google uses to see which search results best satisfy users' intent. Good meta descriptions have the following qualities:

Roughly 160 characters in length (including spaces)
Gives users an idea of what to expect from the page
Mentions the page's main keyword at least once
Makes a short but compelling case for why the user should click on the listing
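Title and meta description lengths are easy to audit in bulk. Here is a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries and hypothetical product URLs; the 60- and 160-character limits simply mirror the guidelines above.

```python
import requests
from bs4 import BeautifulSoup

TITLE_LIMIT = 60
DESCRIPTION_LIMIT = 160

def audit_snippet_tags(url):
    """Report whether a page's title tag and meta description fit the length guidelines."""
    soup = BeautifulSoup(requests.get(url, timeout=30).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else ""
    desc_tag = soup.find("meta", attrs={"name": "description"})
    description = desc_tag.get("content", "").strip() if desc_tag else ""
    return {
        "url": url,
        "title": title,
        "title_ok": 0 < len(title) <= TITLE_LIMIT,
        "description_ok": 0 < len(description) <= DESCRIPTION_LIMIT,
    }

if __name__ == "__main__":
    # Hypothetical URLs; swap in pages from your own store.
    for url in ["https://www.example-store.com/denim-jeans/",
                "https://www.example-store.com/blog/how-to-care-for-denim/"]:
        print(audit_snippet_tags(url))
```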
Also read: How Search Engines Work – Google, Yahoo, Bing – to Rank Your Website
deweydguinn · 7 years ago
Text
Can you predict what the future holds for your inbound links?
Almost five years ago I wrote an article about predicting a site's future and using that expectation to decide whether you should pursue links on that site today. Much has changed in the search engine optimization (SEO) landscape since then, so I decided to expand and update my original article.
Sometimes, what’s old is old
It’s interesting to run into sites we’ve worked with in the past and compare their previous and current metrics. Lots of things pop up like:
Old links are still live, but the host page is now full of new links that weren't there before.
Pages that once ranked well no longer do so.
Links that were not originally in the articles have since been added.
And sometimes everything is just the same as before, if not better!
A look into the past
It’s easy to determine what a site looked like in the past and compare it to the current site by using Archive.org.
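If you want to check for historical snapshots programmatically, the Wayback Machine exposes a simple availability endpoint. A minimal sketch, assuming Python with the requests library and a hypothetical prospect domain:

```python
import requests

WAYBACK_API = "https://archive.org/wayback/available"

def closest_snapshot(url, timestamp="20150101"):
    """Return the archived snapshot closest to a YYYYMMDD timestamp, or None if never archived."""
    resp = requests.get(WAYBACK_API, params={"url": url, "timestamp": timestamp}, timeout=30)
    resp.raise_for_status()
    return resp.json().get("archived_snapshots", {}).get("closest")

if __name__ == "__main__":
    snap = closest_snapshot("example-prospect-site.com")  # hypothetical domain
    if snap:
        print("Closest snapshot:", snap["timestamp"], snap["url"])
    else:
        print("No archived snapshots found.")
```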
You may notice a lot of changes, such as good and bad redesigns, deleted links and entire articles being removed. Occasionally you may even find that a whole site has been deindexed in Google.
Due diligence
When starting a link campaign, it is important to go through a number of steps or perform “due diligence” using checklists and guidelines you’ve established.
It may be impossible to check every page but try to do as much as possible so nothing is overlooked. Here are some issues to check for:
Is the site indexed in Google?
Are there any spammy hacks on the site that haven’t been fixed?
Is there contact info on the site?
Does the site rank for its brand and major keywords?
If you’re placing a link in an existing piece of content, does that page rank for its title?
Is the site free from links and ads for gambling, payday loans, drugs, and porn?
Have you checked to make sure the content is original and not scraped or duplicated?
And always, always…does it look like your link would be a natural fit and get clicked on here?
There's more to check depending on the industry and the individual website, but notice that it's pretty uncomplicated, common-sense stuff.
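A few of these checks lend themselves to scripting, for example confirming that a prospective page still resolves, isn't noindexed and doesn't carry obvious red-flag terms. Below is a minimal sketch, assuming Python with the requests and beautifulsoup4 libraries and a hypothetical prospect URL; the manual judgment calls above still apply.

```python
import requests
from bs4 import BeautifulSoup

RED_FLAG_TERMS = ("casino", "payday loan", "viagra")  # illustrative list, extend as needed

def basic_due_diligence(url):
    """Run a handful of automated checks on a prospective linking page."""
    resp = requests.get(url, timeout=30)
    soup = BeautifulSoup(resp.text, "html.parser")
    robots_tag = soup.find("meta", attrs={"name": "robots"})
    robots = robots_tag.get("content", "").lower() if robots_tag else ""
    header_robots = resp.headers.get("X-Robots-Tag", "").lower()
    text = soup.get_text(" ", strip=True).lower()
    return {
        "status_ok": resp.status_code == 200,
        "not_noindexed": "noindex" not in robots and "noindex" not in header_robots,
        "no_red_flag_terms": not any(term in text for term in RED_FLAG_TERMS),
        "has_contact_link": bool(soup.find("a", href=lambda h: h and "contact" in h.lower())),
    }

if __name__ == "__main__":
    # Hypothetical URL of a page you're considering for a link placement.
    print(basic_due_diligence("https://prospect-blog.example.com/best-denim-guide/"))
```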
So how in the world can you predict what’s going to happen after you finish working on the site?
How do you know the webmaster won’t fill the site up with spam, sell the domain, let it expire or sell the site to a private blog network?
There tend to be signs, both good and bad.  Let’s start with the bad signs.
Bad signs
Here are a few red flags to look for when negotiating for link placement:
The webmaster gives you a list of 50 other “great” sites he has.  While some people just own a lot of sites, it is doubtful the other 50 will be as good as the one you sought out.  Look carefully.
The webmaster asks if you mind if he gives your information to “friends” who own similar websites. Watch for heavy interlinking with the friend sites — they may possibly even be owned by the same person who’s just using aliases.
Traffic on the site has dipped dramatically in the past, even if it's good now. If the dip was five years ago and everything has been steady since, it should be OK; but if you see lots of dips, especially within the past few years, that may be a sign a new drop will happen soon.
They openly advertise that they sell text links.  Big red flag here; you do not want to work with a site that is basically asking for a Google penalty.
Good signs
Now let’s look at a couple factors that distinguish sites where links live for years and everything is still looking great.
Traffic is fairly steady (or continues to increase) through the years with no major dips.
Articles are well-written, guest or sponsored posts are identified as such and don’t appear to be full of someone else’s links.
Notice the good list is shorter than the bad list. That’s because you never know what will happen. Is everyone going to eventually get hit in some way since the algorithm changes constantly? Maybe.
Disavow madness
Don’t forget some people disavow like crazy, and they don’t just disavow single webpages — they disavow entire domains, because it’s easier.
I know of sites whose owners want to disavow upwards of 75 percent of their links even though they don't have a penalty and haven't been negatively impacted by an algorithmic change!
Honestly, when it comes to links, anything can happen. You never know when a site will be penalized, and it’s possible for them to get caught in a wide net and not deserve it. I’ve seen unfair penalties many times and seen sites suddenly drop in rankings and never get back to where they once were, even if they did nothing wrong.
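If, after careful review, you genuinely do need to disavow at the domain level, a small script can collapse a hand-reviewed URL export into unique domain: directives so you can see exactly which domains you are about to write off. A minimal sketch, assuming Python's standard library and a hypothetical bad_urls.txt file:

```python
from urllib.parse import urlparse

def disavow_domain_lines(url_list_path="bad_urls.txt"):
    """Collapse a reviewed list of bad URLs into unique domain: directives."""
    with open(url_list_path, encoding="utf-8") as handle:
        urls = [line.strip() for line in handle if line.strip()]
    domains = sorted({urlparse(u).netloc.replace("www.", "") for u in urls if urlparse(u).netloc})
    return [f"domain:{d}" for d in domains]

if __name__ == "__main__":
    # bad_urls.txt is a hypothetical export of URLs you have already reviewed by hand.
    for line in disavow_domain_lines():
        print(line)
```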
You can’t predict what will happen in link building or SEO. You can make some very educated guesses but change is the only thing you can really guarantee.
The post Can you predict what the future holds for your inbound links? appeared first on Search Engine Land.